OddsPortal is a sports odds comparison platform. The product was outdated and losing ground. The task was simple: redesign it and bring in more users. What wasn't there: analytics, user interviews, internal data of any kind.
01 · The Constraint
No way to know who was using the product, how, or why they left. That's not just a research gap — it's a different kind of design problem.
The constraint became the direction. If there's no first-party data, you build a method that generates its own evidence. You make decisions defensible through rigor, not through data you don't have.
02 · Building Evidence
Industry surveys filled the demographic gap. FSGA and SBA data painted a clear picture: 80% male, 50% aged 18–34, and, critically, betting behaviour directly linked to research time. More than half of heavy bettors spend over an hour researching every event they bet on. The product's job wasn't just to display odds. It was to support a research ritual that already existed in the user's life.
Benchmarking
OddsChecker and DraftKings set the standard for what good looks like in this space: complex information handled through clear architecture, betting flows reachable in 2–3 clicks. A documented OddsChecker redesign case study from 2019 provided validated pain points from an equivalent product — a shortcut to insights that would normally require primary research.
The opportunity tree
With an open scope and no internal data to filter by, I needed a framework that made prioritisation explicit rather than intuitive. Starting from a single North Star — increase the number of users going to a specific sportsbook — I mapped three strategic outcomes: more site visits, more coupon clicks, more direct sportsbook conversions. Every hypothesis flowed from those. Twelve experiments scoped, each tied to a specific outcome, not a wish list.
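As a rough sketch, the tree reads as a nested structure: one North Star, three outcomes, hypotheses underneath. The example below uses a few of the scoped experiments by name; which outcome each one actually rolled up to is an illustrative assumption here, not the documented scoping.

```python
# Illustrative sketch of the opportunity tree. The outcome-to-experiment
# mapping below is assumed for demonstration, not the documented scoping.
OPPORTUNITY_TREE = {
    "north_star": "Increase users going to a specific sportsbook",
    "outcomes": {
        "more_site_visits": ["Terms glossary", "Betting tips section"],
        "more_coupon_clicks": ["Newsletter"],
        "more_direct_conversions": ["Top sportsbooks list"],
    },
}

def experiments_for(outcome: str) -> list[str]:
    """Return the hypotheses scoped under one strategic outcome."""
    return OPPORTUNITY_TREE["outcomes"].get(outcome, [])

print(experiments_for("more_site_visits"))
```

The point of the shape is the constraint it enforces: a hypothesis with no parent outcome has no place in the tree, so nothing ships as a wish-list item.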
03 · Where the Design Landed
Information architecture
The existing structure scattered content types across the same surfaces. The redesign gave each one its own dedicated page — results, leagues, teams, odds — with clear navigation between them. Popular games and leagues were surfaced on the home screen, following the pattern established by the strongest competitors.
Vocabulary
Sports betting language varies by region and product. For new users, it's a real learning curve. The design leaned into common vocabulary aligned with products users already knew, reducing cognitive load without dumbing the product down. A glossary was scoped as a low-resource feature with long-term SEO and new-user acquisition value. Few competitors had one.
Growth surfaces
Four experiments from the opportunity tree made it into the design: a terms glossary, a newsletter, a betting tips section, and a top sportsbooks list. Each tied to a specific business outcome in the tree. Not features — testable hypotheses.
Affiliate model
The existing ad units broke the experience with generic placements. The redesign replaced them with contextually relevant affiliate offers: qualified leads for sportsbooks, better UX for users. The bet button was redesigned to route directly to the sportsbook in one click, removing a friction point that had been costing conversions on every bet placed.
Visual identity: dark theme, built on evidence — ~85% of users in this category prefer it. For a product used in high-focus sessions, often at night, it was a usability decision before it was an aesthetic one. The logo was rebuilt from scratch; the existing identity didn't survive the new visual system.
04 · Honest Results
The redesign shipped a complete visual overhaul and a structured growth strategy. Four growth hypotheses scoped and ready to test. Affiliate model rebuilt. New IA. New identity.
The gap
No user testing on the final UI. The design was grounded in research, but it went from concept to final without anyone sitting in front of it. For a product with this level of information density — multiple odds, multiple sportsbooks, multiple event types on one screen — that's a real gap.
The other gap
The experiments still need to run. An opportunity tree is a prioritisation tool, not a delivery. Without instrumentation and a testing cycle, the hypotheses stay hypotheses.
If I were starting over, I'd scope the feedback loop into the brief from day one. Smaller initial deliverable if needed — but a path to validation built in from the start.
05 · What's Next
The growth strategy is ready. What it needs is a product team willing to run it. Instrument the product. Run the four scoped experiments against defined success metrics. Test the IA with real users. Build out the personalisation layer — team-targeted content and geolocation-based sportsbook recommendations are both in the opportunity tree, waiting.
The infrastructure is there. The hypothesis is sound. The return depends on what happens next.