In November 2025, BBC included WEARFITS in an article about fashion’s sizing crisis. We read it while debugging a rendering issue with a shoe model that kept floating 3cm above someone’s foot. The article was excellent — Shiona McCallum did a thorough job covering the £190 billion returns problem. Our CEO Łukasz Rzepecki sent the team a screenshot. We appreciated the recognition.
But here’s the thing: we’ve been doing this since 2019. We’ve been covered by BBC, Business of Fashion, Vogue Polska, the European Commission, Elle, Glamour, Fashion Biznes, MIT Sloan Review, and cited in peer-reviewed academic research. We’ve given over 14 conference talks, appeared in 8 podcast and video segments, and built virtual try-on technology that’s been tested by journalists, researchers, and real brands shipping real products.
That’s not a newcomer’s CV. That’s almost seven years of building, learning, and shipping — long before most of today’s VTO startups wrote their first line of code.
So here’s what we’ve learned from being on both sides of the story — the press version and the production version.
Looking back at our press folder is like watching a technology mature — from curiosity to infrastructure. And because we’ve been here from the beginning, the timeline of media coverage maps directly to what we were building at each stage.
In September 2019, MIT Sloan Review PL ran a piece on our comfort advisor technology. Our CEO Łukasz did the interview. VTO was a curiosity — the kind of thing journalists filed under “future of retail.” But we were already building the core technology that powers our products today.
Then in February 2020, the European Commission wrote a full article about our virtual fitting room — noting the EUR 744,704 total investment, including EUR 529,308 from the EU Regional Development Fund. A proper, detailed write-up about the technology behind what we were building.
Most of today’s VTO startups didn’t exist yet. We were already deep into the engineering.
Between 2021 and 2022, the fashion press started paying attention. Elle quoted us on fashion trends. Glamour featured our work at their conference, where we demonstrated live AR try-on on stage. The coverage was growing — and, importantly, journalists were coming to us, not the other way around.
The real validation came in 2023, when Luxus Magazine did something none of the previous journalists had done: they actually tested our product. A journalist used our Hockerty integration — the custom footwear brand — and wrote about the real experience of trying on shoes through a screen. Not a press release. Not a demo video. An actual hands-on test, published in a French luxury magazine.
That’s the kind of third-party validation you can’t buy. And it came because the product was ready for real-world use, not just conference demos.
In June 2024, Business of Fashion published a piece on Eastern Europe’s fashion-tech boom. Marc Bain wrote: “Poland is home to fashion-focused start-ups such as virtual try-on provider Wearfits.” BoF wasn’t asking “will VTO work?” anymore. They were mapping an ecosystem. That’s a meaningful shift.
Then, in 2025, BBC and Vogue arrived within the same month. BBC’s Shiona McCallum covered the £190 billion sizing crisis and listed us among the solutions. Vogue Polska ran a print interview with Kasia Gola as a fashion data expert. Meanwhile, researchers cited WEARFITS in a peer-reviewed textile e-commerce paper published by De Gruyter; we only found out when we stumbled across the citation ourselves.
Academic recognition from researchers who independently chose our technology as a reference. BBC, BoF, and Vogue coverage from journalists who found us, not the other way around. That’s not marketing. That’s credibility built over seven years of consistent work.
And in January 2026, our team wrote a piece for Fashion Biznes ranking VTO startups seen at Web Summit. The question was no longer “will virtual try-on work?” but “which solution works best for which use case?” We’re the ones experienced enough to answer that question — because we’ve been building across categories (shoes, bags, and now apparel) longer than most.
The press has done a solid job of explaining three things that we’ve seen firsthand from the inside:
BBC’s £190 billion annual returns figure is not hype. If anything, it’s conservative. The Fashion & Textile Association called it a “downward spiral” of cheap clothing produced to offset return costs, and they’re right. Every fashion brand we talk to — every single one — lists returns in their top three operational problems. We’ve been working on this problem since 2019. The media is catching up to what we’ve known for years.
Five years ago, VTO was a conference demo. Now it’s on product pages. That transition from “cool thing we saw at a trade show” to “feature our e-commerce team needs to evaluate” is real, and the press has tracked it well. Fashion Biznes noted that 42% of online shoppers feel product photos don’t represent them, and 59% are disappointed when the product looks different than expected. VTO is shifting from a nice-to-have to something brands actually budget for.
The media has correctly identified that the conversation has moved from “should we experiment with VTO?” to “which VTO solution fits our stack?” Brands are issuing RFPs, comparing vendors, asking about integration timelines and catalog coverage. We’ve been through enough of these procurement cycles to know: the brands that buy are the ones who’ve done their homework. And the media coverage is helping them do that homework faster.
The media gets three important things wrong about VTO. And because we’ve been in this industry longer than most, we see the consequences of these misunderstandings every time a brand comes to us after a failed pilot with the wrong technology.
The first, and biggest, is treating every “virtual try-on” tool as the same technology. The BBC article listed us alongside sizing tools like True Fit and EasySize, AR try-on companies, and Gen AI clothing solutions. And on the surface, that makes sense — we’re all trying to reduce the gap between what a customer expects and what they receive.
But these are fundamentally different technologies solving different parts of the problem. A sizing recommendation tool analyzes body measurements and suggests a size. An AR shoe try-on overlays a 3D model of a shoe onto your actual foot in real time. A Gen AI clothing try-on generates an image of what you might look like wearing a garment. These require different engineering, different data, different integration approaches, and they produce different results.
Our core is visual try-on — showing customers how products look on them before they buy. That’s what we’ve spent seven years perfecting. But within visual try-on alone, AR shoe rendering and Gen AI apparel rendering are distinct disciplines. Lumping them together would be like writing an article about “transportation technology” and treating bicycles, electric cars, and cargo ships as variations of the same thing. They all move things from A to B, but that’s about where the similarity ends.
This matters because when a brand reads one of these articles and decides to “try VTO,” they often don’t know what they’re actually shopping for. And that leads to mismatched expectations, failed pilots, and a general sense that “VTO doesn’t work” — when really, the wrong type of VTO was applied to the wrong problem.
The second is mistaking demos for products. Almost every article about VTO includes a description of how impressive the demo looks. And yes, demos look impressive. That’s the point of a demo.
But the real challenge isn’t making one shoe appear on one foot. It’s making 10,000 shoes appear on millions of feet without crashing. It’s handling the asset pipeline — taking thousands of product photos, converting them to VTO-ready 3D models, managing seasonal refreshes, maintaining quality across different shoe categories. It’s CDN architecture, device-specific optimisation, and a backend that doesn’t fall over when a major retailer puts your widget on their homepage during a sale.
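As a rough illustration of that pipeline point, here is a minimal TypeScript sketch of a batch conversion step. Everything here is an assumption for illustration: `convertCatalog`, the injected `convert` callback, and the retry policy are ours, not WEARFITS’s actual tooling. The idea it shows is that at catalog scale, a failing SKU gets retried and then flagged for review instead of stalling the whole refresh.

```typescript
// Illustrative sketch only: batch-convert product photos into VTO-ready model
// references, with per-SKU retries so one bad asset cannot stall a large run.

interface ConversionResult {
  sku: string;
  modelUrl?: string; // set on success
  error?: string;    // set when all retries are exhausted
}

async function convertCatalog(
  skus: string[],
  convert: (sku: string) => Promise<string>, // photo-to-3D service, returns a model URL
  maxRetries = 2,
): Promise<ConversionResult[]> {
  const results: ConversionResult[] = [];
  for (const sku of skus) {
    let lastError = "";
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        results.push({ sku, modelUrl: await convert(sku) });
        lastError = "";
        break; // success: move on to the next SKU
      } catch (e) {
        lastError = String(e); // remember the failure and retry
      }
    }
    // Exhausted retries: record the failure instead of aborting the batch.
    if (lastError) results.push({ sku, error: lastError });
  }
  return results;
}
```

In a setup like this, flagged SKUs would feed a manual QA queue rather than blocking a seasonal refresh.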
None of that makes headlines. But it’s the difference between a pilot and a production deployment. We’ve built this infrastructure over seven years — 73,000+ try-on sessions processed in 60 days for one deployment alone. That kind of reliability doesn’t come from a demo. It comes from years of engineering.
The third is underestimating the last mile. Getting VTO to work in a demo is table stakes. Getting it to work on a €15/month Android phone, over a 3G connection, in direct sunlight, while the customer is walking — that’s the actual product.
Sixty to seventy percent of online shoppers browse on mid-range Android devices. Not the latest iPhone. Not a desktop with a fast connection. The phone they bought two years ago, with a screen that’s slightly cracked, running three other apps in the background, connected to whatever Wi-Fi or mobile signal they happen to have.
This is the environment where VTO has to work. And it’s an environment that’s almost never represented in media coverage, because journalists test products on new devices in well-lit offices. The last mile is not glamorous, but it’s where adoption lives or dies.
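That constraint can be expressed in code. A common pattern on the web (sketched below; `pickAssetTier` and its thresholds are illustrative assumptions, not WEARFITS’s production logic) is to read the browser’s device and network hints, such as `navigator.deviceMemory` and `navigator.connection.effectiveType`, and serve lighter 3D assets to constrained devices:

```typescript
// Illustrative sketch: pick a VTO asset tier from device and network signals.
// Thresholds and names are assumptions, not production values.

type AssetTier = "low" | "medium" | "high";

interface DeviceSignals {
  deviceMemoryGb: number; // from navigator.deviceMemory (a coarse browser hint, in GB)
  effectiveType: string;  // from navigator.connection.effectiveType, e.g. "3g"
  saveData: boolean;      // true when the user has requested reduced data usage
}

function pickAssetTier(s: DeviceSignals): AssetTier {
  // Respect an explicit data-saver preference before anything else.
  if (s.saveData) return "low";
  // Very slow networks get low-poly models and compressed textures.
  if (s.effectiveType === "2g" || s.effectiveType === "slow-2g") return "low";
  // Mid-range phones (commonly 4 GB of RAM or less) or 3G get the medium tier.
  if (s.deviceMemoryGb <= 4 || s.effectiveType === "3g") return "medium";
  return "high";
}
```

The tier would then map to different model resolutions and texture sizes on the CDN, so the same widget degrades gracefully instead of failing on the phones most shoppers actually carry.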
Seven years of building VTO across shoes, bags, and now apparel has given us a perspective that newer entrants simply don’t have yet. Here’s what we’ve learned:
Early on, we focused on building impressive technology: advanced 3D rendering, sophisticated tracking algorithms. The kind of engineering that wins awards. But we learned that brands don’t evaluate VTO by polygon count. They evaluate it by how quickly it integrates with their Shopify store, how easily their catalog team can manage it, and whether it actually moves their conversion numbers.
That insight reshaped everything we build. Today, our integration paths — Shopify plugin, web widget, API — are designed around the brand’s workflow, not our architecture. That’s a lesson that took years and real deployments to learn.
We care deeply about quality. But the honest truth is that a fast, easy-to-integrate try-on that’s 90% perfect will outperform a slow, hard-to-deploy try-on that’s 100% perfect. Every time.
That’s why we built our photo-to-AR technology — generating VTO-ready assets from a single product photo — and why our embeddable components are designed to be lightweight enough that they don’t compete with the brand’s product page. Quality matters. But quality you can’t operationalise is just a demo.
Conference demos look incredible. Beautiful lighting, carefully selected hero products, full-screen experiences. But a real deployment means your widget coexists with other page elements, loads on mid-range devices, and works within the brand’s existing design system. We’ve rebuilt our entire approach around lightweight, embeddable components that play nicely with real e-commerce pages. That’s the product. The demo is just the introduction.
We’re not going to make sweeping predictions. Instead, here’s what we’re actually building and seeing from inside the company — not from reading about it, but from shipping it.
First, Gen AI is changing the game for apparel try-on. We’ve just launched a Gen AI virtual try-on for apparel, and it’s a different approach to the problem than AR shoe rendering. Instead of overlaying a 3D model in real time, it generates a realistic image of you wearing the garment. The engineering is different, the user experience is different, and the possibilities are different. AR and Gen AI aren’t competing — they’re complementary tools for different product categories.
Second, the line between “try-on” and “content creation” is blurring. When you can generate an image of a customer wearing your product, is that a try-on tool or a marketing tool? The answer is increasingly both. We’re seeing brands think about VTO not just as a conversion feature but as a content pipeline.
Third, the winner won’t be the company with the best AI alone. It will be the company that combines quality AI with the easiest integration, the fastest speed, and the deepest industry knowledge. We’ve spent seven years accumulating that knowledge across shoes, bags, and apparel. That’s not something you can shortcut.
And we’re not done. What you see today — AR try-on for shoes and bags, Gen AI for apparel — is just what’s already live. Behind the scenes, we’re working on the problems that actually hold the industry back: affordable, fast digital twins that any brand can create without a 3D team. Accurate sizing that works from a photo. The kind of real business challenges that sound boring in a headline but save brands millions in returns and lost revenue.
There’s plenty more in the oven. We’ve been shaping this space since 2019, and we plan to keep at it — solving one hard problem at a time.
We’ve been at this for seven years. We’ve gone from an EU Commission write-up and zero paying customers to real deployments with real brands, 73,000+ try-on sessions in 60 days, academic citations, and coverage in BBC, Business of Fashion, Vogue, and a dozen other publications.
We’re grateful for every media mention. But the real version of WEARFITS isn’t the headline version. It’s a team in Krakow that’s been building virtual try-on longer than most of the companies now entering the space have existed. A team that’s obsessed with the hard problems: making digital twins affordable and fast enough for any brand, cracking the sizing challenge, and turning every product page into a fitting room that actually works on real devices for real customers.
The industry is waking up to virtual try-on. We’ve been awake since 2019 — and we’re not slowing down. The next chapters are already being written.
If you want to see what’s live today — and get a peek at what’s coming — we’d love to show you.
Curious what VTO looks like beyond the headline?
We’ll show you the real thing — including the boring parts.