Planning, Personalization, and the Spatial Web: Applying AI to the Home
I. Introduction
The home is a primary organizer of our lives. Every decision about layout, furniture, lighting, or material affects how we move, rest, work, and relate to one another. Despite its centrality, the home remains one of the most opaque, underserved, and fragmented domains in the digital landscape. We navigate the design and management of domestic space through disconnected tools: search engines, design software, spreadsheets, ecommerce platforms, and personal intuition. None of these systems understand the home as a space, nor the person as a changing agent within it. They lack memory. They lack context. They lack vision.
This white paper introduces a different approach: one grounded in spatial intelligence, powered by a new kind of model — a Large World Model (LWM) for Domestic Life. This system goes beyond traditional AI by representing not only language and images but also places. It creates living models of real homes that evolve with the people who inhabit them. It sees the house both as geometry and as a platform for planning, expression, and care.
To do this, the LWM integrates multiple layers of information: computer vision for object and layout recognition, dynamic spatial twins for modeling environments, member profiles for taste and constraint, and generative systems for proposing changes, additions, or interventions. These tools work together as an orchestrated system — enabling continuous collaboration between people, places, and things. The platform that emerges can do something as simple as suggesting a chair or a paint color, while also helping members understand what’s possible in their space, what’s compatible with their lives, and what’s meaningful to their values. It is a system for seeing, planning, and deciding — for translating lived experience into structured design intelligence.
In the pages that follow, we’ll explore how this system works: how it interprets the visual world, constructs spatial knowledge, personalizes environments, and proposes actionable paths forward. We’ll also examine how the LWM differs from other forms of artificial intelligence — and what it tells us about the future of planning, personalization, and the home.
II. Understanding Space: From Vision Models to Spatial Twins
Before a home can be planned, designed, or improved, it must be understood. This is the starting point of the Large World Model: a rich and evolving comprehension of space as it is — as a lived environment full of real constraints, real objects, and real potential. Traditional design software begins with a floor plan or CAD model, assuming clean inputs and expert operators. The LWM begins with the world as it is encountered: photos, videos, scans, measurements, lists, and memories — the kinds of materials any household can provide. From these fragments, the model builds a cohesive understanding of the space using advanced computer vision and multimodal reasoning.
The LWM's vision model interprets visual data in layers to derive meaning from images. It identifies objects like "sofa" or "lamp," and also classifies materials, detects scale and proportions, infers affordances, and even estimates wear or condition. Depth estimation and 3D reconstruction methods translate flat images into spatial fields, allowing rooms to be virtually reconstructed and navigated in three dimensions. This process draws on photos taken for the system as well as real estate listings, past design plans, video walkthroughs, or even receipts and annotated sketches. Each media type becomes another input into the evolving spatial profile — a visual twin that reflects both the visible and the inferred.
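To make this layered interpretation concrete, the sketch below shows how a single room photo might pass through an object-detection layer and a depth-estimation layer using off-the-shelf pretrained models. The specific models, thresholds, and the RoomObservation structure are illustrative assumptions, not the LWM's actual vision stack.

```python
# A minimal, illustrative vision pass over one room photo.
# Model choices and the RoomObservation structure are assumptions, not the LWM's stack.
from dataclasses import dataclass, field
from PIL import Image
from transformers import pipeline

detector = pipeline("object-detection", model="facebook/detr-resnet-50")
depth_estimator = pipeline("depth-estimation", model="Intel/dpt-large")

@dataclass
class RoomObservation:
    source: str                                   # e.g. "member_photo", "listing", "walkthrough_frame"
    objects: list = field(default_factory=list)   # detected items: label, confidence, bounding box
    depth_map: object = None                      # relative depth image, a cue for scale and layout

def observe(image_path: str, source: str) -> RoomObservation:
    image = Image.open(image_path).convert("RGB")
    obs = RoomObservation(source=source)
    # Layer 1: what is in the room.
    obs.objects = [d for d in detector(image) if d["score"] > 0.7]
    # Layer 2: how far away each surface is, feeding later 3D reconstruction.
    obs.depth_map = depth_estimator(image)["depth"]
    return obs
```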
The result is a spatial twin: a digital model of a real environment that updates and adapts over time. Unlike static models, the LWM’s twin is layered with semantic intelligence — it understands that the kitchen is not just a box, but a functional zone with specific patterns of use, types of storage, and constraints on layout based on plumbing, light, and circulation. This twin is linked to real-world data: dimensions, lighting conditions, inventory, and even utility behavior if available. It can track change over time — noting, for example, that a room was painted last year, or that furniture has been moved to accommodate a growing family. It can reason about adjacency, flow, and access in human terms. It becomes a long-term substrate for spatial thinking, not a one-time diagram.
Importantly, the LWM treats the home as a temporal system that integrates time, change, and context. It tracks changes across seasons, across phases of life, across decisions made and unmade. Because planning is rarely a single event — and often unfolds over weeks, months, or years — the model incorporates a time-aware structure. This includes calendar integrations, usage tracking, and anticipatory planning for future events such as moves, renovations, or life milestones. The spatial twin therefore goes beyond mirroring space. It is a memory system and forecasting tool. It supports not just what is, but what was — and what could be.
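One minimal way to picture the spatial twin is as a data structure that couples room-level geometry and inventory with a timestamped change log, so the model can answer both "what is" and "what was." The sketch below is a simplified assumption about such a schema; the classes and field names are hypothetical, not the platform's real data model.

```python
# Hypothetical schema for a spatial twin with a time-aware change log.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ChangeEvent:
    when: date
    room: str
    description: str                 # e.g. "repainted walls", "moved crib out, added desk"

@dataclass
class Room:
    name: str
    area_m2: float
    zone: str                                              # e.g. "kitchen", "sleep", "work"
    constraints: list[str] = field(default_factory=list)   # e.g. "plumbing on north wall"
    inventory: list[str] = field(default_factory=list)     # object ids from the vision layer

@dataclass
class SpatialTwin:
    home_id: str
    rooms: dict[str, Room] = field(default_factory=dict)
    history: list[ChangeEvent] = field(default_factory=list)

    def record_change(self, event: ChangeEvent) -> None:
        # The twin is a memory system: changes are appended, never overwritten.
        self.history.append(event)

    def changes_for(self, room: str) -> list[ChangeEvent]:
        # Supports "what was": every recorded change for a given room, in order.
        return [e for e in self.history if e.room == room]
```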
III. Room-by-Room Intelligence
Not all square footage is created equal. A home is not a neutral volume to be filled, but a constellation of spaces — each with its own logic, constraints, and emotional weight. The kitchen is different from the bedroom, not only in function but in rhythm, accessibility, and social meaning. Recognizing this, the LWM treats each room as a site of specific intelligence. Room-level understanding is critical to personalization and planning. It’s where spatial and semantic knowledge converge. The system must know not just what a “bathroom” is in the abstract, but what this bathroom needs — given its size, natural light, occupant preferences, and existing inventory.
Each room type has typical patterns of use, furniture groupings, ergonomic rules, and code-based requirements. Together, these define the affordances of a space and ground its functional intelligence. The LWM models these affordances as a set of expectations, which it then compares against the spatial twin to flag mismatches and opportunities. Is there enough counter space for how the member uses their kitchen? Could the living room better accommodate hosting if reconfigured? Does the closet align with the member’s actual storage needs? By modeling room functionality through both general heuristics and individual behavior data, the system supports decisions that are both intelligent and highly contextual.
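As a rough illustration of how affordance expectations might be compared against the twin, the sketch below checks a kitchen against a generic counter-space heuristic and a short list of expected fixtures, scaled by observed behavior. The thresholds and expectations are invented for the example and are not building-code values or the system's actual rules.

```python
# Illustrative check of a kitchen against generic affordance expectations.
# Thresholds and expected fixtures are invented for the example.
KITCHEN_EXPECTATIONS = {
    "min_counter_run_m": 3.0,
    "needs": ["sink", "cooktop", "refrigerator"],
}

def flag_kitchen_mismatches(counter_run_m: float, inventory: list[str],
                            household_cooks_daily: bool) -> list[str]:
    flags = []
    # Scale the generic heuristic to the member's observed behaviour.
    required = KITCHEN_EXPECTATIONS["min_counter_run_m"] * (1.2 if household_cooks_daily else 1.0)
    if counter_run_m < required:
        flags.append(f"Counter run of {counter_run_m:.1f} m is below the ~{required:.1f} m "
                     "this household likely needs.")
    flags += [f"No '{item}' found in the kitchen inventory."
              for item in KITCHEN_EXPECTATIONS["needs"] if item not in inventory]
    return flags
```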
Rooms also carry symbolic and emotional meaning, a semantic layer that extends beyond function. A child’s bedroom is different from a guest room, even if they are the same size. A dining room may be formal or casual, used daily or only on holidays. The LWM integrates these nuances by learning from each member’s habits, cultural background, and priorities — all recorded through onboarding, interaction history, and behavioral cues. This lets the system make proposals that go beyond optimizing utility. It can propose a reading nook not because it fits, but because it fits you — your history, your habits, your aspirations.
Because the LWM integrates inventory management at the object level, it knows what is already in each room — not just what could be added. This lets it model opportunities for reconfiguration, re-use, and refinement. A shelf in one room might work better elsewhere. A table may need to be upgraded based on wear or changing needs. These suggestions are rooted in both spatial fit and lifecycle understanding. Over time, this also enables proactive planning. If a child is outgrowing their room or if a renovation is approaching, the system can begin preparing recommendations based on predicted change.
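A small sketch of how such a reconfiguration suggestion might be scored follows: it combines rough spatial fit in a target room with the item's estimated remaining condition. The inputs and weights are assumptions chosen for clarity, not the LWM's actual scoring.

```python
# Illustrative score for relocating an existing item to another room,
# combining rough spatial fit with remaining condition. Weights are assumptions.
def relocation_score(item_width_m: float, free_wall_m: float,
                     room_type_affinity: float, condition: float) -> float:
    """
    room_type_affinity: 0..1, how typical this item is for the target room type
    condition: 0..1, estimated remaining useful life from the vision layer
    """
    if item_width_m > free_wall_m:
        return 0.0                                # does not physically fit
    fit = 1.0 - item_width_m / free_wall_m        # more slack, better fit
    return 0.5 * fit + 0.3 * room_type_affinity + 0.2 * condition
```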
Each room becomes a node where design intelligence meets lived experience, where design grammar meets daily life. The LWM holds a grammar of how spaces typically work — but it never forces that grammar. Instead, it negotiates between best practice and personal preference, creating room-by-room guidance that is usable, livable, and incrementally better. What emerges is not a rigid ideal home, but a system that adapts the logics of good design to each life it serves — room by room, phase by phase.
IV. Personalization at the Core
The challenge of personalization involves not only knowing what someone likes but also understanding who they are, how they live, and what they are trying to become — and then shaping their environment to support that evolution. In the context of the home, personalization must move beyond aesthetics to touch function, time, budget, values, and aspiration. The LWM is built with this complexity in mind.
The process moves from style recognition to value modeling. Every member begins with a distinctive pattern of taste, shaped by cultural influences, past purchases, and their existing surroundings. The LWM builds a multidimensional profile of each member, starting with visual preference recognition — identifying affinities for specific colors, forms, materials, and brands — and extending to deeper signals such as sustainability priorities, spatial needs, and emotional associations. This profile adapts based on feedback, purchases, plans, conversations, and project history. In this way, personalization becomes a process of mutual learning — the system learns the member, and the member learns what is possible.
The next step is to map preferences to actionable design space. The LWM maps member preferences into a design vector space that aligns with the style ontology used to classify all inventory on the platform. This makes it possible to match not just one chair to one taste, but to filter an entire design universe down to the subset most likely to resonate — and to evolve those recommendations over time. Beyond personal style, the system also integrates: Budget (absolute and per-room allocations, plus sensitivity to price-performance ratio), Schedule (urgency of delivery, project phasing, and coordination with upcoming milestones), Physical Constraints (existing inventory, spatial limitations, accessibility needs), and Emotional Goals (desires for comfort, status, creativity, calm, stimulation, or memory). These variables shape the “design brief” that the LWM builds and continuously refines. Personalization becomes about curating fewer, better, more relevant things — ones that fit both the space and the person over time.
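A minimal sketch of this matching step, assuming member tastes and catalogue items share one embedding space: hard constraints such as budget and physical fit filter first, then cosine similarity ranks what remains. The field names and structures are hypothetical, not the platform's ontology.

```python
# Sketch of personalized retrieval: filter by hard constraints, then rank by
# cosine similarity to the member's style vector. All fields are illustrative.
import numpy as np

def recommend(member_vector: np.ndarray, items: list[dict],
              budget: float, max_width_m: float, top_k: int = 5) -> list[tuple]:
    """items: dicts with 'name', 'vector', 'price', 'width_m'."""
    m = member_vector / np.linalg.norm(member_vector)
    scored = []
    for item in items:
        # Hard constraints first: budget and physical fit.
        if item["price"] > budget or item["width_m"] > max_width_m:
            continue
        v = item["vector"] / np.linalg.norm(item["vector"])
        scored.append((float(m @ v), item["name"]))
    # Soft ranking by stylistic closeness.
    return sorted(scored, reverse=True)[:top_k]
```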
Ultimately, the LWM becomes a long-term partner for planning and change. It maintains a memory of each member's projects, purchases, and patterns. It remembers which pieces were kept, which were returned, which were loved, and which were never used. It learns what “working” means in the context of a given household. This makes it uniquely suited to manage change — from a child growing up, to a move, to a new relationship, to aging in place. The system adapts and carries forward, offering continuity across phases of life. The result is a personalized domestic intelligence — capable of making the home more livable, expressive, and aligned with the people inside it.
V. Generative Design and Recommendation Systems
Design is traditionally a reactive process — you describe a need, and someone responds with a plan. With an LWM, design becomes generative: the system proposes, suggests, and evolves ideas in real time, based on your space, your preferences, and your goals. Beyond a recommendation engine, it is an active design partner, capable of imagining new possibilities and adapting them to reality.
The LWM supports the progression from constraints to proposals. It takes in a rich array of inputs — member preferences, spatial measurements, budgetary constraints, existing inventory, and project goals. From these, it generates design and planning recommendations that are more than filtered lists. These may include layout proposals that reposition furniture or add new pieces to improve flow, function, or aesthetics; object recommendations for specific furniture, lighting, or accessories that match taste and space; phased plans that reflect evolving needs or budgets over time; material swaps to reduce environmental impact or harmonize a palette; and moodboards or visual previews of suggested combinations, layouts, and styles. These proposals are always grounded in reality — the known spatial twin, the known inventory, the known schedule, and related service providers. At the same time, they are expressive, opening space for imagination and delight.
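One simplified way to picture the phased-plan output is a greedy grouping of candidate changes into phases under a per-phase budget, as sketched below. The Change structure and the greedy strategy are illustrative assumptions rather than the system's actual planner.

```python
# Sketch of a phased plan: greedily group candidate changes into phases so that
# no phase exceeds a per-phase budget. The structure and strategy are illustrative.
from dataclasses import dataclass

@dataclass
class Change:
    description: str
    cost: float
    priority: int        # 1 = do first

def phase_plan(changes: list[Change], per_phase_budget: float) -> list[list[Change]]:
    phases, current, spent = [], [], 0.0
    for change in sorted(changes, key=lambda c: c.priority):
        if current and spent + change.cost > per_phase_budget:
            phases.append(current)          # close the current phase
            current, spent = [], 0.0
        current.append(change)
        spent += change.cost
    if current:
        phases.append(current)
    return phases
```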
This capacity leads to real-time, responsive interaction. The generative layer is interactive. Users can ask questions (“What could this room look like with a darker rug?”), reject proposals, favorite options, or change constraints midstream. The system responds by updating its suggestions, modifying the plan, or offering explanations. This conversational and visual loop creates a more fluid and accessible design process — one that reduces the cognitive burden of decision-making while increasing the quality of the results. In contrast to e-commerce interfaces that show everything, the LWM shows only what makes sense — narrowing an infinite universe into manageable options.
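The sketch below illustrates, under simplified assumptions, how such feedback events might update the working design brief: rejections exclude items, favorites nudge the style vector, and constraint changes overwrite brief fields. The event shapes and update rules are hypothetical.

```python
# Illustrative feedback handling: each event updates the working design brief.
# Event shapes and update rules are assumptions; style vectors are numpy arrays.
def apply_feedback(brief: dict, event: dict) -> dict:
    if event["type"] == "reject":
        brief.setdefault("excluded", set()).add(event["item"])
    elif event["type"] == "favorite":
        # Nudge the member's style vector toward the favorited item's embedding.
        brief["style_vector"] = 0.9 * brief["style_vector"] + 0.1 * event["item_vector"]
    elif event["type"] == "constraint":
        brief[event["field"]] = event["value"]   # e.g. field="budget", value=1200
    return brief                                 # downstream code re-ranks with the new brief
```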
Beyond recommending products, the LWM is also capable of designing experience. It can suggest actions — to rearrange, to repaint, to donate, to schedule — and tie them to service providers, events, and lifecycle needs. In this way, design expands beyond aesthetic coordination to encompass the lived experience of home: what to do next, how to get it done, and how it fits into a larger rhythm of care. The system can answer a question as straightforward as, “Which chair?” as well as more complex questions such as: “Should we reconfigure this room now or later?” “Can this piece be repurposed?” “What will this look like once it arrives?” “Who can help me with this step?”
Generative recommendations are tied directly to fulfillment pathways. If a member accepts a suggestion — whether a furniture set, a room layout, or a project plan — the platform can coordinate execution: purchasing, delivery, setup, disposal, or ongoing maintenance. This seamless handoff from suggestion to action transforms inspiration into tangible change. In this way, the LWM is a transformation engine — turning vague desires into precise instructions, then into structured action.
VI. Between Text and Image, Vector and Pixel
A fundamental shift in how we understand and represent the home lies at the core of the LWM. Traditional AI systems, particularly Large Language Models (LLMs), operate primarily in the domain of text — abstracting human intent into sequences of tokens, then returning structured output. The LWM operates differently: it inhabits a continuous space between text and image, between vector and pixel — capable of moving fluidly across media, from conversation to plan, from prompt to projection.
Here, a vector is not merely a mathematical encoding of language or image, but a trajectory: a statement of intent that is inherently spatial, temporal, and actionable. It represents the direction of a member’s desire — to furnish a room, to define a new phase of life, to create a specific atmosphere — and it carries with it a history of context, preference, and constraint. A pixel, in contrast, is a momentary rendering — the visible resolution of many such vectors, brought together to generate an image, plan, suggestion, or purchase.
The LWM is trained to model the relationship between these two dimensions. This mirrors the logic of architectural creation, where vision emerges from an evolving conversation between text, sketch, diagram, model, and built form. Today’s tools allow that relationship to unfold across media simultaneously — enabling real-time convergence between intent and execution. Inventory and metadata are gathered and encoded from across fragmented ecosystems. Each object is assigned a vector value tied to OurThings’ style matrix and material ontology. Members' preferences are mapped onto this style-space via their actions, tastes, and environments. Project parameters — room type, layout, schedule, climate, budget — narrow the design envelope. Generative models then create recommendations grounded in this multidimensional vector field. Transactions occur through API integrations with external points of sale, triggered by user or agent action. Every choice, adjustment, or refusal is logged and weighted, creating a longitudinal memory of the project.
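As an illustration of that longitudinal memory, the sketch below re-estimates a member's style vector as a recency-weighted average of logged events, where purchases count more than favorites and returns count against. The event types, weights, and half-life are assumptions made for the example.

```python
# Illustrative longitudinal memory: re-estimate a member's style vector as a
# recency-weighted average of logged events. Weights and half-life are assumptions.
import numpy as np
from datetime import date

EVENT_WEIGHTS = {"purchased": 1.0, "favorited": 0.5, "returned": -0.8, "rejected": -0.3}

def update_style_vector(events: list[tuple], half_life_days: float = 180.0) -> np.ndarray:
    """events: (date, event_type, item_vector) tuples; assumes at least one event."""
    today = date.today()
    numerator = np.zeros_like(events[0][2], dtype=float)
    total_weight = 0.0
    for when, kind, vector in events:
        age_days = (today - when).days
        weight = EVENT_WEIGHTS.get(kind, 0.0) * 0.5 ** (age_days / half_life_days)
        numerator += weight * vector
        total_weight += abs(weight)
    return numerator / total_weight if total_weight else numerator
```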
A system emerges that gives reality to a question, a desire, or an aspiration. It becomes a system where the poetic can be inhabited — made literal through design, planning, and transaction. This new feedback loop between user, space, and system allows homes to be more responsive and become repositories of meaning over time — a narrative made tangible in material form. Design, in this world, becomes post-stylistic. Because all styles are accessible, the work becomes not about imposing a singular vision, but negotiating needs — curating ethical materials, sustainable practices, and human-centered experience. The LWM unifies the previously fragmented domains of design, commerce, memory, and interface into a single system of truth — one capable of reading across file types, spatial data, user profiles, and evolving tastes. In doing so, it brings unprecedented coherence to the way we shape, understand, and inhabit the world.
VII. The Spatial Web as Platform
The true power of the LWM lies in its combination of intelligence and connectivity. A home is more than a private space—it functions as a node within a vast and fragmented ecosystem of goods, services, logistics, utilities, cultural values, and social relationships. By anchoring that ecosystem to a spatial model of the home, the LWM enables a new kind of infrastructure: a spatial web of commerce. It becomes a web made of rooms, objects, people, and events — each linked through semantics, intent, and transaction. In this web, the home becomes an interface, and the world becomes accessible through it.
Ultimately, the spatial twin must be linked to the world. Every spatial twin is both a 3D model and a dynamic interface layered with live connections. Product metadata links every object to its brand, supplier, availability, lifecycle data, and resale potential. Service integrations connect design decisions to installation, maintenance, repair, or insurance. Platform-wide data enables comparisons, benchmarks, and community insight (e.g. “others with similar spaces chose this configuration”). Scheduling systems sync projects with calendars, delivery timelines, and seasonality. Sustainability metrics quantify impact and support better choices across space and time. These connections allow the platform to operate as a real-time planning and execution layer — where decisions are no longer confined to static checklists or isolated apps, but unfold across space and time as orchestrated workflows. This makes it possible to address a range of challenges that might arise. If a delivery is delayed, the system can adjust the project timeline and notify the relevant service provider. If a user wants to rearrange a room, the system can check compatibility with existing items, propose new layouts, and identify movers or installers. If a planned renovation triggers insurance requirements or city permits, the platform can surface those pathways and offer assistance. The goal, ultimately, is to augment the household's own planning with a system that sees the whole: space, objects, services, timing, and constraints — and coordinates among them.
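To suggest how that coordination might look in code, the sketch below handles one of the scenarios above: a delayed delivery shifts dependent tasks and produces notifications for affected providers. The project structure and handler are hypothetical simplifications, not the platform's API.

```python
# Hypothetical handler: a delayed delivery shifts dependent tasks and notifies providers.
# The 'project' structure (deliveries, tasks, depends_on) is an assumption for the sketch.
from datetime import date

def handle_delivery_delay(project: dict, item_id: str, new_eta: date) -> list[str]:
    delivery = project["deliveries"][item_id]
    shift = new_eta - delivery["eta"]             # how far the timeline slips
    delivery["eta"] = new_eta
    notifications = []
    for task in project["tasks"]:
        if item_id in task["depends_on"]:
            task["start"] += shift                # push dependent work back by the same amount
            notifications.append(
                f"Notify {task['provider']}: '{task['name']}' now starts {task['start']}."
            )
    return notifications
```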
The result is a new interface for domestic life. Where previous platforms have centered on transactions or content, the spatial web centers on place. This means that interaction flows through spatial logic: members browse by room, act within context, and see impact as it will appear in their real lives. It transforms the abstract interface into an embodied one — visual, grounded, and immediate. In this world, the home is no longer just something to decorate. It becomes a living interface — through which planning, communication, memory, and meaning are continuously shaped.
VIII. The Home as Responsive System
The home has always been more than shelter. It is a memory system, a social structure, a site of identity and change. Yet the tools we’ve used to shape and manage our homes have failed to meet this complexity. They are too static, too fragmented, too abstracted from real life. The LWM that we are building offers — through spatial intelligence, generative design, personalization, and platform integration — a new way of thinking about the home as a responsive system. This system begins with the world as it is — the messiness of real homes, real needs, and real constraints. It then builds a layered, evolving understanding of space, behavior, desire, and change. Rather than designing in the abstract, it supports real people as they navigate life inside their homes: planning, adapting, making choices, and investing in their environments over time.
In this model, the home becomes a long-term collaboration — between people and AI, between past and future, between aspiration and action. It becomes a site of intelligence and care. This approach goes beyond technical advancement to redefine how domestic life can be understood and supported in the digital age. It suggests a new standard for what platforms can be: not only engines of consumption and entertainment, but also partners in the fundamental task of shaping space, meaning, and quality of life. The Large World Model makes visible what was hidden, makes possible what felt overwhelming, and makes personal what once felt generic. It bridges the gap between what we imagine and what we inhabit. And in doing so, it helps restore something essential: the ability to feel at home in the spaces in which we live — both now and as we evolve across phases of life.