Report Structure Analysis

Analysis of four example due diligence reports (Credora, QuestCorp, RedStone, Senna), carried out to understand their current structure and inform the design of the Report Viewer.

Related docs: This analysis informs the Report Viewer design (see ux-master-brief.md Sections 9-10). For data model implications, see tech-start.md. For visual treatment of citations and evidence, see visual-style-brief.md.


1. Overall Report Structure

All reports follow a consistent hierarchical structure with numbered sections:

Standard Section Hierarchy

1. Introduction
   1.1. Project Overview
   1.2. Project Background
   1.3. Key Stakeholders
   1.4. Project Timeline

2. Research Findings
   2.1. Market Research Findings
   2.2. Competitive Landscape
   2.3. Technology & Product Analysis
   2.4. Business Model Validation
   2.5. External Validation
   2.6. Conflicting Information
   2.7. Other Information

3. High Priority Issues
   3.1. [Issue Title]
   3.2. [Issue Title]
   ... (typically 7-10 high priority issues)

4. Other Issues
   4.1. [Issue Title]
   ... (typically 10-18 lower priority issues)

5. Potential Opportunities / High-Value Opportunities
   5.1. High-Value Opportunities (Detailed Analysis)
   5.2. [Opportunity Title]
   ... (typically 3-6 detailed opportunities)

6. Other Opportunities
   6.1. [Opportunity Title]
   ... (typically 5-10 additional opportunities)

7. Conclusion
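
For the viewer, this hierarchy could be modeled as a simple section tree. A minimal sketch, assuming a recursive structure (type and field names are assumptions, not taken from tech-start.md):

```typescript
// Hypothetical model of the numbered section hierarchy, e.g. for the
// viewer's table of contents (all names assumed).
interface ReportSection {
  number: string             // "3", "3.1", ...
  title: string              // "High Priority Issues", an issue title, ...
  kind: 'intro' | 'findings' | 'issue' | 'opportunity' | 'question' | 'conclusion' | 'other'
  children: ReportSection[]  // e.g. 3.1, 3.2 nested under 3
}
```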

Variations by Report Type

| Report | Unique Sections |
| --- | --- |
| QuestCorp | "Executive Summary" at the start, plus "Key Questions and Areas for Further Investigation" before the Conclusion |
| Senna | "Questions for Project Team" section (7 detailed questions) before the Conclusion |
| Credora / RedStone | Standard structure |

2. Introduction Section Structure

1.1 Project Overview

Contains these consistent subsections:

  • Core Value Proposition - What the project does, key technology, current status
  • Target Market - Primary/secondary market segments, evidence of product-market fit
  • Current Stage - Development status, key deployments, regulatory approvals
  • Key Metrics - Structured data including:
    • Corporate status
    • Intellectual Property
    • Funding History
    • Current Ask (if applicable)
    • Team Size
    • Financials (revenue, burn rate if available)
    • Developer Adoption metrics

1.2 Project Background

  • Founding Story - Incorporation details, founding team, origin story
  • Problem Statement - What problem the project solves
  • Solution Approach - Technical methodology
  • Unique Differentiators - Key competitive advantages

1.3 Key Stakeholders

  • Founding Team - Bio for each founder with verified credentials
  • Advisory Board - If applicable
  • Key Investors - Lead investors, funding rounds
  • Strategic Partners - Major integrations and partnerships
  • Community/User Base - User metrics, community programs
  • Digital Authority Signals - Publications, GitHub activity, certifications
  • Network Quality & Access - Connections and ecosystem access
  • Execution Track Record & Communication Consistency - Historical patterns

1.4 Project Timeline

Chronological list with dates:

YYYY-MM-DD: Event description
YYYY-MM-DD: Event description
Planned (6-12 months): Future roadmap items
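
Because entries follow this fixed shape, the viewer could parse them with a small regex. A minimal sketch, treating "Planned" roadmap items separately (function and type names are hypothetical):

```typescript
// Parse "YYYY-MM-DD: Event description" and "Planned (...): ..." lines.
interface TimelineEntry {
  date?: string       // set for dated entries (YYYY-MM-DD)
  planned?: string    // set for roadmap items, e.g. "6-12 months"
  description: string
}

function parseTimelineLine(line: string): TimelineEntry | null {
  const dated = line.match(/^(\d{4}-\d{2}-\d{2}):\s*(.+)$/)
  if (dated) return { date: dated[1], description: dated[2] }
  const planned = line.match(/^Planned\s*\(([^)]+)\):\s*(.+)$/)
  if (planned) return { planned: planned[1], description: planned[2] }
  return null // not a timeline entry
}
```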

3. Research Findings Structure

2.1 Market Research Findings

  • Market Size & Growth - TAM/SAM/SOM with sources and dates
  • Industry Trends - Key market dynamics
  • Regulatory Environment - Relevant regulations and compliance
  • Geographic Analysis - Regional considerations
  • Customer Segments - Target customer profiles
  • Market Timing & Catalysts (if applicable)

2.2 Competitive Landscape

  • Direct Competitors - Named competitors with positioning
  • Indirect Competition - Alternative solutions
  • Competitive Advantages - Strengths vs competitors
  • Market Positioning - Where the project sits
  • Barriers to Entry - What protects the market

2.3 Technology & Product Analysis

  • Technical architecture review
  • Validation findings
  • Scalability assessment
  • Technical risks

2.4 Business Model Validation

  • Revenue model
  • Cost structure
  • Financial health
  • Unit economics
  • Partnership ecosystem

2.5 External Validation

  • Third-party assessments
  • Audits and certifications
  • Expert/community sentiment

2.6 Conflicting Information

  • Discrepancies found in research
  • Unverified claims
  • Contradictory sources

2.7 Other Information

  • Additional relevant findings
  • Capital structure
  • Management structure

4. High Priority Issues Structure

Full Issue Format (used in Section 3)

Each high priority issue follows this structure:

### 3.X. [Issue Title]

**Issue Description**
[2-3 paragraphs describing the issue, why it matters, and time horizon]

**Assessment**
- Time Horizon: Short/Medium/Long + specific timeframe
- Complexity: Low/Medium/High (X-Y% of resources); [description]
- Likelihood: Low/Medium/High; [probability and reasoning]
- Impact: Low/Medium/High; [specific consequences]
- Priority: HIGH/MEDIUM/LOW; [justification]

**Potential Impact**
[Paragraph describing revenue, operational, and reputational consequences]

**Contributing Factors & Likelihood**
[Paragraph with evidence for likelihood assessment]

**Current Mitigation Measures**
- Current Stage Options: [immediate actions possible]
- Next Stage Options: [actions for next funding round]
- Resource Requirements: [budget and team allocation]

**Supporting Evidence**
[Specific dates, transactions, quotes, observations that support the issue]

**Verification Methods**
[How to validate/verify the issue is resolved]
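
Since every full-format issue uses the same bold subheadings, a parser could split an issue into named sections. A rough sketch, assuming the issue body arrives as plain markdown text with each header on its own line (function name is hypothetical):

```typescript
// Split a full-format issue body into sections keyed by its **bold**
// headers, e.g. "Issue Description", "Assessment", "Supporting Evidence".
function splitIssueSections(body: string): Map<string, string> {
  const sections = new Map<string, string>()
  let current: string | null = null
  for (const line of body.split('\n')) {
    const header = line.match(/^\*\*(.+?)\*\*\s*$/) // header on its own line
    if (header) {
      current = header[1]
      sections.set(current, '')
    } else if (current !== null) {
      sections.set(current, (sections.get(current) ?? '') + line + '\n')
    }
  }
  return sections
}
```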

Condensed Issue Format (used in Section 4)

### 4.X. [Issue Title]

**Issue & Time Horizon:** Short/Medium. [Brief description]

**Assessment Summary:** Complexity: X; Likelihood: X; Impact: X; Priority: X
with the justification [reason].

**Evidence & Stage Context:** [Key evidence and stage-appropriate context]

**Mitigation & Verification:** [Combined mitigation steps and verification]

5. Opportunities Structure

Full Opportunity Format (used in Section 5)

### 5.X. [Opportunity Title]

**Opportunity Description**
[2-3 paragraphs describing the opportunity and its strategic value]

**Assessment**
- Novelty: Standard/Emerging Pattern/New; [description]
- Time Horizon: Immediate (3-6 months) / Next Stage (6-18 months)
- Effort: Low (<10%) / Moderate (10-30%) / Substantial (30-60%)
- Testability: Highly Testable / Moderately Testable + [resources needed]
- Value: Low/Medium/High (Nx) [multiplier potential]

**Potential Impact**
[Specific revenue and strategic impact projections]

**Contributing Factors & Feasibility**
[Why this opportunity is achievable]

**Required Resources & Conditions**
[Team, budget, dependencies, prerequisites]

**Supporting Evidence**
[Market data, precedents, partner interest]

**Verification Methods**
[How to test/pilot the opportunity before full commitment]

Condensed Opportunity Format (used in Section 6)

### 6.X. [Opportunity Title]

**Opportunity & Value:** [Description]. This opportunity has [High/Medium] value potential.

**Assessment:** Novelty: X; Time Horizon: X; Effort: X; Testability: X; Value: X

**Feasibility & Requirements:** [Key requirements and dependencies]

**Evidence & Validation:** [Supporting evidence]

6. Questions for Project Team (Senna-style)

Each question follows this detailed structure:

### 7.X. Question X: [Topic]

**Question:** [The specific question to ask]

**Why This Question Matters:**
[2-3 sentences explaining the strategic importance]

**What Strong Answers Reveal:**
- Scenario 1 - Exemplary Response: [What a great answer looks like]
- Scenario 2 - Adequate Response: [What a passable answer looks like]
- Scenario 3 - Weak/Concerning Response: [Red flag responses]

**Interpretation Guidance:**
[How to interpret different responses and follow-up actions]

**Connection to Issues/Opportunities:**
[Links to specific issues and opportunities this validates]

7. Evidence and Citation Format

Inline Evidence Style

Evidence is embedded directly in the prose, anchored to specific dates:

  • "On October 8, 2025, specific operational issues were observed..."
  • "A key production transaction from January 9, 2025, on Arbitrum One..."
  • "As of October 2025, the unlocked supply had increased to..."

Supporting Evidence Sections

Located within each Issue/Opportunity block:

**Supporting Evidence**
Operational instability was observed on October 8, 2025, including
KMS /getquote timeouts and HTTP 502 errors. A key production
transaction from January 9, 2025, on Arbitrum One demonstrates a
functioning pipeline, but its associated EAS UID was not retrievable
via public explorer UIs.

Key Evidence Patterns

  1. Transaction References: Contract addresses, transaction hashes, chain names
  2. Date Stamps: Always YYYY-MM-DD format with specific dates
  3. Governance References: "FIP-361", "Snapshot vote", proposal names
  4. Market Data: TVL figures with sources (DefiLlama, etc.) and capture dates
  5. Quantitative Claims: Always attributed ("self-reported", "management-stated", "verified by")
  6. Verification Status: Claims marked as "unverified", "pending confirmation", "verified"
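
These patterns are regular enough to extract mechanically. A sketch of the date, transaction-hash, and verification-status extraction (the EVM-style hash regex is an assumption about the transaction references):

```typescript
// Extract recurring evidence markers from a Supporting Evidence paragraph.
const DATE_RE = /\b\d{4}-\d{2}-\d{2}\b/g      // YYYY-MM-DD date stamps
const TX_HASH_RE = /\b0x[a-fA-F0-9]{64}\b/g   // EVM-style tx hashes (assumed)
const STATUS_RE = /\b(unverified|pending confirmation|verified)\b/gi

function extractEvidenceMarkers(text: string) {
  return {
    dates: text.match(DATE_RE) ?? [],
    txHashes: text.match(TX_HASH_RE) ?? [],
    statuses: text.match(STATUS_RE) ?? [],
  }
}
```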

8. Assessment Metrics Summary

Priority Levels

| Priority | Description |
| --- | --- |
| HIGH | Near-Term Serious Threat / Blocker / Existential Risk |
| MEDIUM | Manageable but Important / Next-Stage Challenge |
| LOW | Nice to Have / Long-term Consideration |

Complexity Levels

| Complexity | Resource Allocation |
| --- | --- |
| Low | <10% of resources |
| Low-Medium | 5-15% of resources |
| Medium | 10-25% of resources |
| High | 25-50%+ of resources |

Time Horizons

| Horizon | Timeframe |
| --- | --- |
| Short | 0-6 months / Immediate |
| Medium | 6-18 months / Next Stage |
| Long | 18+ months / Future |

Value Multipliers (Opportunities)

| Value | Multiplier |
| --- | --- |
| Low | <1.5x |
| Medium | 1.5-2x |
| High | 2-3x+ |
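
For badge rendering and filtering, these scales could be encoded once as shared lookup tables. A sketch (constant names are assumptions):

```typescript
// Shared lookups for the assessment scales above (names assumed).
const COMPLEXITY_RESOURCES: Record<string, string> = {
  'Low': '<10% of resources',
  'Low-Medium': '5-15% of resources',
  'Medium': '10-25% of resources',
  'High': '25-50%+ of resources',
}

const VALUE_MULTIPLIERS: Record<'Low' | 'Medium' | 'High', string> = {
  Low: '<1.5x',
  Medium: '1.5-2x',
  High: '2-3x+',
}
```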

9. "Cheat Sheet" Concept

There is no explicit "cheat sheet" section, but the following serve as quick-reference summaries:

1. Table of Contents

  • Provides full navigation with numbered section references

2. Executive Summary (QuestCorp style)

  • 1-2 page summary at start with:
    • Key Findings (bulleted critical issues)
    • Opportunities and Contingencies (conditions for success)

3. Assessment Summary Lines

Each issue/opportunity contains a one-line summary:

Assessment Summary: Complexity: High; Likelihood: High; Impact: Critical;
Priority: HIGH with the justification [reason].

Potential "Cheat Sheet" Implementation

For the Report Viewer, consider generating the following (a sketch of the first item follows this list):

  1. One-page Issue Matrix: All issues with Priority/Complexity/Impact in a table
  2. Opportunity Scorecard: All opportunities with Value/Effort/Testability ratings
  3. Key Questions Checklist: Extracted questions with checkbox status
  4. Evidence Index: Searchable list of all dates, transactions, and sources
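
A sketch of generating the one-page Issue Matrix, assuming the Finding shape defined under Data Model Alignment below (function and row names are hypothetical):

```typescript
// Build Issue Matrix rows from parsed findings, ordered by priority.
interface IssueMatrixRow {
  title: string
  priority: 'HIGH' | 'MEDIUM' | 'LOW'
  complexity: string
  impact: string
}

function buildIssueMatrix(findings: Finding[]): IssueMatrixRow[] {
  const rank = { HIGH: 0, MEDIUM: 1, LOW: 2 }
  return findings
    .map(f => ({
      title: f.statement,
      priority: f.assessment.priority,
      complexity: f.assessment.complexity,
      impact: f.assessment.impact,
    }))
    .sort((a, b) => rank[a.priority] - rank[b.priority])
}
```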

10. Mapping to Research Tech Product Concepts

This section connects the analyzed report structure to the product concepts defined in the UX Master Brief.

Cheat Sheet Content (from existing reports)

The UX Brief defines the Cheat Sheet as showing "Top Risks, Top Opportunities, Key Questions to Ask." From the analyzed reports, these map to:

| Cheat Sheet Element | Source in Current Reports |
| --- | --- |
| Top Risks | Section 3: High Priority Issues (extract title + Assessment Summary) |
| Top Opportunities | Section 5: High-Value Opportunities (extract title + Assessment) |
| Questions to Ask | Section 7: Questions for Project Team (Senna-style) |

Extraction pattern for Cheat Sheet:

// Each Finding in the cheat sheet should include:
interface CheatSheetItem {
  statement: string       // Issue/Opportunity title
  priority: 'HIGH' | 'MEDIUM' | 'LOW'
  timeHorizon: 'Short' | 'Medium' | 'Long'
  citationCount: number   // Count of Supporting Evidence items
  hasConflict: boolean    // From Section 2.6 Conflicting Information
}

Evidence Drawer Content

The UX Brief specifies: "source URL, extracted snippet, timestamp, source category."

From analyzed reports, the Supporting Evidence sections contain:

  • URL/Reference: Transaction hashes, contract addresses, document page numbers
  • Snippet: Inline quotes or paraphrased findings
  • Timestamp: Dates in YYYY-MM-DD format (always present)
  • Source Type: Web (public data), Uploaded (data room docs), Integration (on-chain)

Key finding: Reports don't use footnote numbers. Citations are inline with dates. The Evidence Drawer should display the full "Supporting Evidence" paragraph with dates highlighted.
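
A minimal sketch of that highlighting, assuming the drawer renders HTML and the evidence text is already escaped:

```typescript
// Wrap YYYY-MM-DD dates in <mark> tags for display in the Evidence Drawer.
// Assumes evidenceText has already been HTML-escaped.
function highlightDates(evidenceText: string): string {
  return evidenceText.replace(
    /\b(\d{4}-\d{2}-\d{2})\b/g,
    '<mark class="evidence-date">$1</mark>'
  )
}
```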

Conflict Detection Mapping

Section 2.6 "Conflicting Information" exists in all reports. Additionally, issues contain:

  • "Verification Status" markers (unverified, verified, pending)
  • Explicit conflict language: "sources disagree," "contradictory," "discrepancy"

For the Review Queue, extract items where (see the predicate sketch after this list):

  1. Verification Status = "unverified" or "pending"
  2. Text contains conflict indicators
  3. Confidence/Likelihood = "Low" or "Medium"
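
A predicate sketch for those criteria, assuming the EvidenceItem and Finding shapes defined below; the reports leave the combination logic open, so any-of (OR) is assumed here:

```typescript
// Flag an evidence item for the Review Queue (any-of combination assumed).
const CONFLICT_WORDS = /sources disagree|contradictory|discrepanc/i

function needsReview(evidence: EvidenceItem, finding: Finding): boolean {
  return (
    evidence.verificationStatus === 'unverified' ||
    evidence.verificationStatus === 'pending' ||
    CONFLICT_WORDS.test(evidence.text) ||
    finding.assessment.likelihood === 'Low' ||
    finding.assessment.likelihood === 'Medium'
  )
}
```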

Progressive Disclosure Layers

The report structure naturally supports the "simple by default, depth on demand" principle:

| Layer | Content | Maps to UX Brief |
| --- | --- | --- |
| Level 0 | Issue/Opportunity titles only | Cheat Sheet bullets |
| Level 1 | + Assessment Summary line | Cheat Sheet with badges |
| Level 2 | + Issue Description + Potential Impact | Expanded card view |
| Level 3 | + Supporting Evidence + Verification Methods | Full detail / Evidence Drawer |
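
A sketch of how the viewer could map these levels onto Finding fields (function name and field grouping are assumptions):

```typescript
// Which Finding fields each disclosure level reveals (cumulative).
type DisclosureLevel = 0 | 1 | 2 | 3

function fieldsForLevel(level: DisclosureLevel): string[] {
  const layers = [
    ['statement'],                                  // Level 0: titles only
    ['assessment'],                                 // Level 1: + summary badges
    ['description', 'potentialImpact'],             // Level 2: + expanded card
    ['supportingEvidence', 'verificationMethods'],  // Level 3: + full detail
  ]
  return layers.slice(0, level + 1).flat()
}
```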

Data Model Alignment

The report structure informs the TypeScript interfaces in tech-start.md:

// Additions based on report structure analysis:

interface Finding {
  id: string
  statement: string           // Issue/Opportunity title
  description: string         // Issue Description paragraph
  assessment: Assessment
  potentialImpact: string
  contributingFactors: string
  mitigationMeasures: MitigationMeasures
  supportingEvidence: EvidenceItem[]
  verificationMethods: string[]
  hasConflict: boolean
}

interface Assessment {
  timeHorizon: 'Short' | 'Medium' | 'Long'
  complexity: 'Low' | 'Low-Medium' | 'Medium' | 'High'
  likelihood: 'Low' | 'Medium' | 'High'
  impact: 'Low' | 'Medium' | 'High' | 'Critical'
  priority: 'LOW' | 'MEDIUM' | 'HIGH'
  justification: string
}

interface EvidenceItem {
  id: string
  text: string                // The evidence statement
  dates: string[]             // Extracted YYYY-MM-DD dates
  sourceType: 'web' | 'uploaded' | 'integration'
  sourceRef?: string          // URL, tx hash, or page number
  verificationStatus: 'verified' | 'unverified' | 'pending'
}

Questions Section (for "Questions to Ask" feature)

The Senna report's Questions section provides a template for generating meeting-ready questions:

Structure to preserve:

  1. The question itself (bold, actionable)
  2. Why it matters (1-2 sentences)
  3. What strong/weak answers look like (helps user interpret responses)
  4. Connection back to specific Issues/Opportunities

UX implication: Questions should link bidirectionally to their related Issues. Clicking a question shows which Issues it validates.
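
A hypothetical shape for these questions, following the conventions of the interfaces above (all field names are assumptions); a matching relatedQuestionIds field on Finding would give the reverse link:

```typescript
// Meeting-ready question with links to the Findings it validates
// (all field names assumed).
interface Question {
  id: string
  topic: string
  question: string             // the bold, actionable question itself
  whyItMatters: string
  strongAnswer: string         // Scenario 1 - Exemplary Response
  adequateAnswer: string       // Scenario 2 - Adequate Response
  weakAnswer: string           // Scenario 3 - Weak/Concerning Response
  interpretationGuidance: string
  relatedFindingIds: string[]  // enables question -> issue navigation
}
```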


11. Recommendations for Report Viewer Design

Navigation Features

  • Collapsible Table of Contents with section jump links
  • Section numbering preserved (3.1, 3.2, etc.)
  • "Back to top" and "Previous/Next section" navigation

Content Display

  • Expandable/collapsible issue and opportunity cards
  • Assessment metrics displayed as visual badges (HIGH = red, MEDIUM = yellow, etc.; see the color-map sketch after this list)
  • Evidence sections highlighted or styled differently
  • Date stamps auto-formatted and potentially timeline-visualized
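
A sketch of the badge color mapping mentioned above (the LOW color and exact values are assumptions; only red and yellow are specified here):

```typescript
// Priority badge colors for the card view (values are placeholders).
const PRIORITY_COLORS: Record<'HIGH' | 'MEDIUM' | 'LOW', string> = {
  HIGH: '#dc2626',   // red
  MEDIUM: '#d97706', // yellow/amber
  LOW: '#16a34a',    // green (assumed)
}
```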

Search and Filter

  • Filter by Priority (HIGH/MEDIUM/LOW)
  • Filter by Time Horizon (Short/Medium/Long)
  • Search within Supporting Evidence sections
  • Quick-find for specific dates or entity names

Summary Views

  • Dashboard showing all issues by priority
  • Opportunity matrix with effort vs value plot
  • Questions checklist with completion tracking
  • Timeline view of all dated evidence

Export Options

  • Generate "Cheat Sheet" PDF with issue/opportunity matrices
  • Export questions as standalone checklist
  • Create evidence index document

12. Appendix: Files Analyzed

| Report | Pages | Issues (High/Other) | Opportunities (High/Other) |
| --- | --- | --- | --- |
| Credora | ~45 | 9 / 10 | 6 / 7 |
| QuestCorp | 37 | 5 / 0 | 5 / 0 |
| RedStone | ~42 | 7 / 11 | 6 / 10 |
| Senna | ~50 | 10 / 5 | 3 / 3 |

Text extracts available at: clients/research-tech/example-reports/text-extracts/