Meta Patents a System That Scans Real Clothes Into Virtual Avatars
Meta has filed a patent for a pipeline that takes a 3D scan of a person wearing real clothes and reverse-engineers the garment — its cut, its fabric physics, even how it drapes — into a form that can live inside a virtual world.
What Meta's real-to-virtual clothing scan actually does
Imagine pointing a 3D camera at yourself in your favorite hoodie and, a few seconds later, that hoodie exists inside a virtual world — not as a flat texture slapped onto a mannequin, but as a real simulated piece of fabric that moves, bunches, and wrinkles the way fabric actually does. That's the core idea here.
Meta's system takes the raw scan data and simultaneously figures out two things: the garment itself, as a sewing pattern (the flat panels a tailor would cut) with estimated fabric properties, and the body shape and pose of the person underneath it. Those outputs feed into a physics simulation that models how the cloth actually sits on your body.
Then the system runs an optimization pass — basically asking, "does my simulated version match the real scan?" and nudging the pattern, the fabric stiffness, and even your body shape until they agree. The end result is a garment asset you could drop onto any avatar, in any pose, and have it behave like real fabric.
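A stripped-down illustration of that loop, in Python. Everything here is invented for the sketch: the one-line "simulator" and the single stiffness variable stand in for a full cloth simulation with many parameters, and the gradient comes from a finite difference rather than a differentiable engine.

```python
def simulate_sag(stiffness, weight=2.0):
    """Toy drape model: heavier or floppier fabric sags more."""
    return weight / stiffness

def fit_stiffness(scanned_sag, stiffness=1.0, lr=0.5, steps=200, eps=1e-4):
    """Nudge stiffness until the simulated sag matches the scan."""
    for _ in range(steps):
        error = (simulate_sag(stiffness) - scanned_sag) ** 2
        bumped = (simulate_sag(stiffness + eps) - scanned_sag) ** 2
        gradient = (bumped - error) / eps   # which way reduces the mismatch?
        stiffness -= lr * gradient          # nudge in that direction
    return stiffness

# The "scan" shows the fabric sagging 0.8 units; recover the stiffness
# that explains it (for this toy model, 2.0 / 0.8 = 2.5).
fitted = fit_stiffness(0.8)
```

The real pipeline does this across thousands of variables at once, but the shape of the loop is the same: simulate, compare to the scan, follow the gradient.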
How Meta's simulation-optimization loop rebuilds a garment
The pipeline has four main stages that run in sequence, with the middle two doing the heavy lifting.
Stage 1 — Scan ingestion: The system receives 3D scan data of a clothed person. Think of this as a detailed point cloud or mesh — millions of data points capturing the surface geometry of both the person and the clothes they're wearing at once.
Stage 2 — Simultaneous decomposition: Rather than solving for the body and the garment separately (which is hard because they're entangled in the scan), the system extracts everything at once: an initial garment pattern (the 2D cut of the fabric), the garment's material properties (stiffness, stretch resistance), and the wearer's body shape and pose.
Stage 3 — Cloth simulation: Those outputs feed into a cloth simulation engine — software that models how fabric drapes under gravity and contact forces — producing an initial 3D garment state.
Stage 4 — Differentiable optimization: This is the clever bit. The system uses differentiable simulation (meaning the simulator can compute gradients — essentially calculate which direction to nudge each variable to reduce error) to iteratively refine the pattern, material, pose, and body shape together until the simulated garment matches the original scan as closely as possible. The output is an optimized garment pattern that can be re-simulated on any body.
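A toy sketch of what that joint refinement looks like in code. Everything here is an assumption made for illustration: the "simulator" is three closed-form measurements of a tube skirt on a cylindrical body, and gradients come from finite differences rather than a differentiable cloth engine.

```python
def simulate(p):
    """Toy 'cloth sim' (all lengths in meters)."""
    circumference = 2 * p["pattern_width"]    # front panel + back panel
    slack = circumference - p["body_girth"]   # extra fabric around the body
    return {
        "circumference": circumference,
        "gap": slack / 2,                     # how far the waist stands off
        "sag": slack / p["stiffness"],        # stiffer fabric droops less
    }

def loss(p, scan):
    """Squared mismatch between simulated and scanned measurements."""
    sim = simulate(p)
    return sum((sim[k] - scan[k]) ** 2 for k in scan)

def fit(scan, params, step_sizes, iters=500, eps=1e-6):
    """Jointly nudge pattern, body, and fabric to match the scan."""
    for _ in range(iters):
        base = loss(params, scan)
        grads = {
            name: (loss(dict(params, **{name: value + eps}), scan) - base) / eps
            for name, value in params.items()
        }
        for name, lr in step_sizes.items():
            params[name] -= lr * grads[name]
    return params

# Measurements "taken from the scan" (consistent with a 0.5 m panel,
# a 0.9 m body girth, and stiffness 2.5 in this toy model's units).
scan = {"circumference": 1.0, "gap": 0.05, "sag": 0.04}
params = {"pattern_width": 0.45, "body_girth": 0.85, "stiffness": 1.0}
# Hand-tuned per-parameter step sizes: the variables live on very
# different scales, a practical wrinkle real optimizers face too.
step_sizes = {"pattern_width": 0.05, "body_girth": 0.2, "stiffness": 100.0}
fitted = fit(scan, params, step_sizes)
```

Because the output is a pattern plus material parameters rather than a baked mesh, the fitted garment can be dropped onto a different body, e.g. `simulate(dict(fitted, body_girth=0.95))`, which is exactly the re-simulation property Stage 4 is after.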
What this means for avatar fashion and virtual try-on
For Meta, this is core infrastructure for the metaverse's most persistent unsolved UX problem: getting people to actually care about their avatars. Avatar fashion is already a multi-billion-dollar market in games like Fortnite and Roblox — but those clothes are hand-authored by artists. A scan-based pipeline could let you wear your actual wardrobe in VR, or let brands offer virtual try-on that genuinely reflects how fabric moves on your body type.
There's also a practical angle for e-commerce and AR. If Meta's Ray-Ban smart glasses or a future headset can scan garments at purchase or at home, retailers could offer returns-reducing virtual fitting that actually accounts for drape and fit — not just size. The optimization loop's ability to recover sewing patterns is particularly notable: it suggests the output could be reused across simulation engines, not locked to one platform.
This is legitimately interesting work, not a vague patent land-grab. The simultaneous body-garment decomposition combined with differentiable cloth simulation is a real technical approach that researchers have been pushing toward for years — and the fact that Meta's Reality Labs team filed this suggests it's closer to a product pipeline than a research curiosity. Whether it ships as a Quest feature, a Horizon avatar tool, or something else entirely, this is the kind of foundational IP that matters.
Editorial commentary on a publicly published patent application. Not legal advice. Patentlyze may earn a commission if you click an affiliate link and make a purchase. This doesn't affect what we cover or how we cover it.