Research Methodology | Knowledge Sourcing Intelligence (KSI)


Our methodology is built on a structured “Market Engineering” workflow that defines market boundaries, builds bottom-up and top-down models, validates with primary inputs, and reconciles outputs with a documented audit trail.

What this delivers

A defensible market size and forecast that reconciles across segments and years, supported by transparent assumptions, validation logic, and a consistent estimation framework.

Pillars: market boundary discipline, bottom-up construction, top-down validation, primary validation, and reconciliation with an audit trail.

Workflow: Define objective & boundaries → Data architecture (sources, filtration & triangulation) → Bottom-up (company revenue & mapping) → Top-down (indicators, penetration, ASP) → Forecasting (drivers, scenarios, volatility checks) → Primary validation (interviews, pricing, adoption, channels) → Reconciliation (normalize gaps, document changes) → Final outputs (market size + forecast, audit trail + confidence).

1) Defining the Research Objective

Market boundaries

  • Inclusions and exclusions (what is in-scope vs out-of-scope).
  • Segmentation hierarchy (product/type/application/end-user/region).
  • Stakeholders and value-chain layers (OEMs, distributors, EPCs, end users, service providers).
  • Base assumptions and validation logic to avoid double counting.

What “market size” means in our framework

Revenue reference point: estimates represent the revenues generated at the manufacturer (or service provider) level within the defined market boundary.

Note: the exact revenue layer (ex-works vs channel-in vs end-user spend) is defined explicitly in each study's scope.

2) Data Engineering, Filtration & Triangulation

We build a clean data architecture by classifying sources, extracting comparable variables, and triangulating across multiple validation paths.

Secondary research (examples)

  • Annual/quarterly reports, earnings decks, and segment notes.
  • Product catalogs, technical documentation, press releases, and industry publications.
  • Official databases and statistical agencies (e.g., World Bank, OECD, Eurostat, Statistics Canada, relevant national bureaus).
  • Trade statistics (where applicable) for import/export and “grey leakage” checks.

Primary research (examples)

  • Expert interviews with manufacturers, distributors, EPCs/OEMs, and end users.
  • Pricing and discount structures (ASP bands by spec, volume, and channel).
  • Adoption and replacement signals (penetration, upgrade frequency, project cadence).
  • Channel structure validation (markups, lead times, procurement behavior).

Triangulation rule: no single dataset is treated as “truth.” We cross-validate estimates across independent datasets and normalize inconsistencies before finalizing outputs.
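The triangulation rule can be sketched in a few lines of Python. The dataset names, figures, and the 15% tolerance threshold below are illustrative assumptions, not values from any actual study:

```python
# Sketch of the triangulation rule: no single dataset is "truth".
# Independent estimates are compared against a consensus midpoint;
# sources beyond a tolerance are flagged for re-examination.
# All names, values, and thresholds are hypothetical.

def triangulate(estimates: dict[str, float], tolerance: float = 0.15) -> dict:
    """Cross-validate market-size estimates (USD M) from independent paths."""
    values = list(estimates.values())
    consensus = sum(values) / len(values)               # unweighted midpoint
    spread = (max(values) - min(values)) / consensus    # relative disagreement
    flagged = [name for name, v in estimates.items()
               if abs(v - consensus) / consensus > tolerance]
    return {"consensus": round(consensus, 1),
            "spread": round(spread, 3),
            "flagged": flagged,                          # sources needing review
            "within_tolerance": spread <= 2 * tolerance}

# Three independent estimation paths for the same market (USD M, illustrative)
paths = {"bottom_up": 412.0, "top_down": 455.0, "trade_stats": 430.0}
result = triangulate(paths)
```

A real workflow would weight sources by reliability rather than use a plain average; the point is that disagreement is measured and resolved before an estimate is finalized.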

3) Market Size Construction

Bottom-up construction

We identify leading manufacturers/importers/system integrators, extract segment-specific revenues, map product portfolios to the study scope, and estimate private company revenues using capacity proxies, pricing bands, and distributor signals.

  • Company revenue mapping → product category mapping → application and end-user mapping.
  • Installed base validation (penetration levels, lifecycle/replacement cycles).
  • Trade deltas (where relevant) to detect import-driven gaps or informal/grey channels.
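The bottom-up mapping above can be sketched as follows. The company names, revenues, in-scope shares, and segments are illustrative assumptions only:

```python
# Illustrative bottom-up aggregation: each company's revenue is mapped to the
# study scope (only the in-scope share counts) and summed by segment.
# All companies and figures below are hypothetical.

companies = [
    {"name": "AlphaCorp", "revenue_usd_m": 120.0, "in_scope_share": 0.60, "segment": "Industrial"},
    {"name": "BetaSys",   "revenue_usd_m":  80.0, "in_scope_share": 1.00, "segment": "Commercial"},
    {"name": "GammaLtd",  "revenue_usd_m": 200.0, "in_scope_share": 0.25, "segment": "Industrial"},
]

def bottom_up_size(companies: list[dict]) -> tuple[dict[str, float], float]:
    """Sum in-scope revenue (USD M) by segment, then total across segments."""
    segments: dict[str, float] = {}
    for c in companies:
        in_scope = c["revenue_usd_m"] * c["in_scope_share"]
        segments[c["segment"]] = segments.get(c["segment"], 0.0) + in_scope
    return segments, sum(segments.values())

segments, total = bottom_up_size(companies)
```

For private companies without disclosed revenue, the `in_scope_share` and revenue figures would themselves be estimated from capacity proxies, pricing bands, and distributor signals, as described above.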

Top-down construction & validation

We build a top-down addressable range using macro and sectoral anchors (industrial output, electricity consumption, capex cycles, infrastructure pipeline, sector growth), then translate demand to value using ASP bands and adoption/penetration logic.

  • Select anchor indicators and run correlation/regression checks to validate directionality.
  • Build low/base/high ranges and compare against bottom-up outputs by year and segment.
  • Diagnose gaps: coverage, price assumptions, informal market, imports, channel markups, and double counts.
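The top-down translation (anchor indicator × penetration × ASP) can be sketched as a minimal range calculation. The anchor value, penetration rates, and ASP bands below are hypothetical:

```python
# Hypothetical top-down sketch: an anchor indicator (here, an addressable
# installed base in thousand units) is converted to market value via
# penetration and ASP assumptions for low/base/high cases.

def top_down_range(anchor_units_k: float,
                   penetration: dict[str, float],
                   asp_usd: dict[str, float]) -> dict[str, float]:
    """Market value (USD M) = anchor (k units) x penetration x ASP (USD) / 1000."""
    return {case: round(anchor_units_k * penetration[case] * asp_usd[case] / 1000, 1)
            for case in ("low", "base", "high")}

rng = top_down_range(
    anchor_units_k=500.0,                                 # illustrative anchor
    penetration={"low": 0.08, "base": 0.10, "high": 0.12},
    asp_usd={"low": 9000, "base": 10000, "high": 11000},
)
```

The resulting low/base/high band is what gets compared against the bottom-up output, year by year and segment by segment.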

Forecasting framework

  • Historical pattern modeling: construct time-series datasets from validated historical inputs, identify volatility cycles and structural breaks, and normalize anomalies using macro/industry indicators.
  • Driver-based forecast: map growth to macro, sectoral, and technology drivers; apply weighted regression or driver-linked logic; adjust for adoption cycles and regulatory/standards signals.
  • Scenario & sensitivity testing: develop base/optimistic/conservative scenarios; test pricing, penetration, and replacement assumptions; ensure segment totals remain consistent over time.
  • Final estimate preparation: produce the market size and forecast trajectory, reconcile segment totals to the overall market, and document assumptions, drivers, and calculation logic.
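A driver-linked forecast with scenario shifts can be sketched as below. The base-year size, driver growth rates, weights, and scenario adjustments are all illustrative assumptions:

```python
# Minimal driver-linked forecast sketch: the base-year size is grown by a
# weighted blend of driver growth rates, shifted up or down per scenario.
# Every figure here is hypothetical.

BASE_YEAR_SIZE = 500.0  # USD M, illustrative base-year market size

drivers = {  # (annual growth rate, weight); weights sum to 1.0
    "industrial_output":   (0.04, 0.5),
    "technology_adoption": (0.09, 0.3),
    "regulatory_push":     (0.02, 0.2),
}
SCENARIO_SHIFT = {"conservative": -0.015, "base": 0.0, "optimistic": 0.015}

def forecast(years: int, scenario: str = "base") -> list[float]:
    """Project USD M values for year 0..years under a blended growth rate."""
    g = sum(rate * w for rate, w in drivers.values()) + SCENARIO_SHIFT[scenario]
    return [round(BASE_YEAR_SIZE * (1 + g) ** t, 1) for t in range(years + 1)]

base_path = forecast(5)
```

In practice the growth rate would not be constant: adoption cycles and structural breaks identified in the historical modeling step would modulate it year by year.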

4) Reconciliation, Confidence Bands & Audit Trail

  • Reconcile top-down ranges vs bottom-up outputs by year and segment.
  • Normalize inconsistencies and remove double counts (channel overlaps, portfolio overlaps).
  • Document assumption changes with rationale (pricing, penetration, coverage, imports).
  • Provide confidence bands (low/base/high) where appropriate.
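The reconciliation step above amounts to a simple in-band check plus a logged delta. The figures below are hypothetical:

```python
# Hypothetical reconciliation check: a year's bottom-up total must fall inside
# the top-down low/high band; the gap vs the base case is recorded so every
# adjustment has an explainable basis. All values are illustrative.

def reconcile(bottom_up: float, top_down: dict[str, float]) -> dict:
    """Compare a bottom-up total (USD M) against a top-down low/base/high band."""
    in_band = top_down["low"] <= bottom_up <= top_down["high"]
    gap_vs_base = (bottom_up - top_down["base"]) / top_down["base"]
    return {"in_band": in_band,
            "gap_vs_base_pct": round(100 * gap_vs_base, 1),
            "action": "accept" if in_band
                      else "diagnose coverage/pricing/imports/double counts"}

check = reconcile(bottom_up=472.0,
                  top_down={"low": 440.0, "base": 490.0, "high": 540.0})
```

An out-of-band result triggers the gap diagnosis described in section 3 (coverage, price assumptions, informal market, imports, channel markups, double counts) rather than a silent adjustment.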

Auditability: our internal workflow preserves an adjustment trail so that every major delta (coverage, markups, private players, trade leakage) has an explainable basis.

This is especially important in markets with fragmented suppliers, limited disclosures, or heavy channel complexity.

5) Deliverables

Quantitative outputs

  • Market size (base year) + 5–10 year forecasts.
  • Segment splits that reconcile to totals (year-wise).
  • Scenario ranges and key assumption tables.

Qualitative outputs

  • Market definition + scope and segmentation definitions.
  • Drivers, constraints, opportunities, and competitive landscape.
  • Strategic recommendations aligned to procurement, channel, and go-to-market realities.

Notes on Sources

Specific sources vary by market and geography. We prioritize official statistics, industry association releases, company disclosures, and validated primary interviews. Where direct data is unavailable, we use transparent proxies (capacity, installed base, trade mapping, pricing bands) and clearly document the logic.