How Should Companies Select Innovation Projects That Actually Deliver?

Murat Peksavaş – Senior Innovation Management Consultant
Most innovation programs fail not for lack of ideas but for lack of focus. The solution is a disciplined project selection system that guides idea intake toward the innovation strategy, applies transparent evaluation criteria, and balances a portfolio across incremental, adjacent, and radical bets. Close the loop with timely feedback to employees, then resource winning teams with coaching and stage-gated budgets.
Why start by narrowing scope instead of “collecting every idea”?
Open calls without direction overwhelm reviewers and frustrate employees—submissions pile up, responses lag, and motivation fades. Begin by publishing a simple, memorable innovation thesis (target customers, pain points, and strategic themes) and ask employees to submit proposals that explicitly map to it. This is not “limiting creativity,” it is converting creativity into relevance. Use an intake form that requires a clear problem statement, evidence of need, and an initial hypothesis for value creation (time saved, defects reduced, revenue unlocked). Whether you collect via an intranet page or a form tool, the message must be consistent: ideas are welcome, but only those aligned with strategy will advance—so tell contributors what “fit” looks like.
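The intake requirements above can be enforced mechanically. The sketch below is a minimal, assumed implementation: the field names, theme slugs, and validation rules are illustrative, not a prescribed schema.

```python
from dataclasses import dataclass

REQUIRED_FIELDS = ("problem_statement", "evidence_of_need", "value_hypothesis")

# Hypothetical strategic themes; substitute your published innovation thesis.
PUBLISHED_THEMES = {"logistics-cost", "customer-onboarding", "energy-efficiency"}

@dataclass
class IdeaSubmission:
    """One intake-form entry; field names are illustrative."""
    title: str
    theme: str                   # must match a published strategic theme
    problem_statement: str = ""
    evidence_of_need: str = ""
    value_hypothesis: str = ""   # e.g. "saves ~4 h/week per planner"

def validate_intake(idea: IdeaSubmission) -> list[str]:
    """Return a list of intake problems; an empty list means the form is complete."""
    issues = []
    if idea.theme not in PUBLISHED_THEMES:
        issues.append(f"theme '{idea.theme}' is not a published strategic theme")
    for name in REQUIRED_FIELDS:
        if not getattr(idea, name).strip():
            issues.append(f"missing required field: {name}")
    return issues
```

Rejecting incomplete forms at submission time keeps reviewer attention on substance rather than chasing missing context.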
How do you keep employees engaged when their ideas don’t advance?
Silence kills participation. Commit to a response service-level agreement (for example, acknowledge within 5 business days, decision within 30). For declined ideas, provide short, constructive rationale linked to the published themes (“out of scope for this cycle,” “insufficient evidence,” “duplicate of project X”). Offer a re-submission path—what additional data or reframing would make it viable? Publicly share aggregated stats each quarter (submissions, approvals, reasons for decline) so the system feels fair and transparent. This feedback loop sustains morale and steadily improves proposal quality, because employees learn how to frame problems, size opportunities, and anchor ideas in the company’s priorities.
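The service-level agreement above is easy to make concrete. A minimal sketch, assuming the example SLAs from the text (acknowledge within 5 business days, decide within 30) and ignoring public holidays:

```python
from datetime import date, timedelta

ACK_SLA_DAYS = 5        # business days to acknowledge a submission
DECISION_SLA_DAYS = 30  # business days to reach a decision

def add_business_days(start: date, days: int) -> date:
    """Advance a date by N business days, skipping weekends (holidays ignored)."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            days -= 1
    return current

def sla_deadlines(submitted: date) -> dict:
    """Compute the two SLA deadlines for a submission date."""
    return {
        "acknowledge_by": add_business_days(submitted, ACK_SLA_DAYS),
        "decide_by": add_business_days(submitted, DECISION_SLA_DAYS),
    }
```

Publishing these computed deadlines alongside each submission makes the quarterly fairness stats straightforward to produce: compare actual response dates against the deadlines.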
What is a practical two-stage filter for proposals?
Stage 1 — Strategic Fit Filter: Does the idea directly serve one of the declared innovation themes or target pain points? If not, it exits with feedback. This keeps reviewers focused and reduces noise.
Stage 2 — Portfolio Mix Filter: Classify the remaining ideas by innovation type—incremental (improving existing offers/processes), adjacent (new customers or capabilities), radical (new business models or technologies). Before the year starts, set target proportions (e.g., 50% incremental, 40% adjacent, 10% radical), then accept ideas until each bucket is reasonably filled. This avoids an all-incremental pipeline and ensures leadership sponsors are mentally and financially ready for a few bolder bets.
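The portfolio mix filter can be sketched as a simple capacity check: translate the target proportions into per-bucket slots, then accept ideas (already ranked by the rubric) until each bucket fills. The proportions below come from the text's example; the slot count and idea names are assumptions.

```python
from collections import Counter

# Target proportions set before the year starts (example from the text).
TARGET_MIX = {"incremental": 0.50, "adjacent": 0.40, "radical": 0.10}

def bucket_capacity(total_slots: int) -> dict[str, int]:
    """Translate target proportions into per-bucket slot counts."""
    return {kind: round(total_slots * share) for kind, share in TARGET_MIX.items()}

def fill_portfolio(ideas: list[tuple[str, str]], total_slots: int) -> list[str]:
    """Accept ideas in priority order until each bucket's slots are filled.
    `ideas` is a list of (name, innovation_type), pre-sorted by rubric score."""
    capacity = bucket_capacity(total_slots)
    taken: Counter = Counter()
    accepted = []
    for name, kind in ideas:
        if taken[kind] < capacity.get(kind, 0):
            taken[kind] += 1
            accepted.append(name)
    return accepted
```

The capacity check is what prevents an all-incremental pipeline: once the incremental bucket is full, even a high-scoring incremental idea waits for the next cycle while adjacent and radical slots remain open.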
Which criteria help separate promising projects from attractive distractions?
Once proposals clear the two filters, score them with a brief, evidence-oriented rubric: (1) Customer value & pain severity (who hurts, how much, how often), (2) Differentiation & moat (IP, data, access), (3) Feasibility & dependency risk (tech, compliance, data availability), (4) Time-to-first-signal (how quickly a PoC can validate assumptions), (5) Strategic leverage (re-usability of the platform, cross-BU potential). Keep scoring lightweight (1–5 scale) and require reviewers to cite the evidence used. External references on scoring and portfolio balance from sources like Harvard Business Review and MIT Sloan Management Review can help calibrate thresholds without over-engineering the process.
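The rubric above maps directly to a small scoring function. This sketch assumes equal weighting across the five criteria (the text does not specify weights) and enforces the rule that every score must cite evidence.

```python
from statistics import mean

RUBRIC = ("customer_value", "differentiation", "feasibility",
          "time_to_first_signal", "strategic_leverage")

def score_proposal(scores: dict[str, int], evidence: dict[str, str]) -> float:
    """Average the five 1-5 criteria; each score must cite the evidence used."""
    for criterion in RUBRIC:
        s = scores[criterion]
        if not 1 <= s <= 5:
            raise ValueError(f"{criterion}: score {s} outside the 1-5 scale")
        if not evidence.get(criterion, "").strip():
            raise ValueError(f"{criterion}: reviewers must cite evidence")
    return mean(scores[c] for c in RUBRIC)
```

Requiring the evidence field at scoring time, not afterward, is what keeps the rubric lightweight yet honest: a reviewer who cannot name the evidence cannot submit the score.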
Who should make the decisions—and how fast?
Create a small Innovation Steering Committee (product, operations, finance, legal/IT) with a named executive sponsor. Decisions should be time-boxed (for example, 10 business days). Empower the committee to: approve PoCs up to a set cap, grant fast-track vendor onboarding for pilots, and guarantee data/site access. Publish the meeting cadence and outcomes on the intranet. Evidence moves money: if predefined success metrics are met in a PoC, the project automatically graduates to a larger rollout budget. Governance that is predictable and quick converts enthusiasm into action (see European Commission SME innovation guidance and McKinsey operating-model notes for reference patterns).
How many projects should go “upstairs,” and in what format?
Less is more. Elevate a short list—typically 10 to 15 proposals per cycle—that already passed the two filters and rubric thresholds. Each one-pager should include: problem statement, target user, baseline metric, proposed intervention, PoC design (6–12 weeks), success criteria with thresholds, data needs, risks, and team composition. This format keeps debate focused on assumptions and evidence, not slide aesthetics. A standard one-pager also makes decline decisions fairer: reviewers compare like with like, and contributors know what excellent looks like.
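A standard one-pager is easy to gate mechanically before it goes upstairs. A minimal completeness check, using section names derived from the list above (the keys are illustrative):

```python
ONE_PAGER_SECTIONS = (
    "problem_statement", "target_user", "baseline_metric", "intervention",
    "poc_design", "success_criteria", "data_needs", "risks", "team",
)

def missing_sections(one_pager: dict[str, str]) -> list[str]:
    """List required sections that are absent or empty, so reviewers
    compare like with like before a proposal is elevated."""
    return [s for s in ONE_PAGER_SECTIONS
            if not one_pager.get(s, "").strip()]
```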
How do you fund and support the projects you select?
Use stage-gated funding: Explore → PoC → Limited Rollout → Scale. Allocate small, fast micro-budgets for exploration (discovery interviews, prototyping), then release larger amounts upon meeting exit criteria. Pair every team with a mentor (internal or external) who meets bi-weekly to unblock access, sharpen experiments, and prepare decision artifacts. Provide “paved paths” for legal, security, and procurement so PoCs can run in real environments. OECD capability-building work emphasizes that enablement functions must measure themselves by how many teams they helped ship safely—adopt that standard internally.
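The gate progression can be sketched as a simple state machine: a project only advances one stage at a time, and only when its exit criteria are met. The budget caps below are assumed placeholders, not recommendations.

```python
# Gates in order, with illustrative (assumed) budget caps per stage.
STAGES = ["Explore", "PoC", "Limited Rollout", "Scale"]
STAGE_CAPS = {"Explore": 10_000, "PoC": 50_000,
              "Limited Rollout": 250_000, "Scale": 1_000_000}

def next_stage(current: str, exit_criteria_met: bool) -> str:
    """Advance exactly one gate, and only when the current stage's
    exit criteria are met; the final stage is terminal."""
    i = STAGES.index(current)
    if not exit_criteria_met or i == len(STAGES) - 1:
        return current
    return STAGES[i + 1]

def released_budget(stage: str) -> int:
    """Budget released at a given stage."""
    return STAGE_CAPS[stage]
```

Encoding the gates this way makes the “evidence moves money” rule auditable: a budget increase is always traceable to a specific met exit criterion.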
What metrics prove your selection system works?
Track a balanced set across the lifecycle. Early: time to first decision, time to first test, share of proposals with validated problem evidence. Mid: PoC conversion rate to rollout, average time from approval to data access, per-site payback at limited rollout. Late: run-rate value (revenue or cost variance), quality/safety improvements, carbon or compliance gains, and percentage of solutions reused across business units. Review these monthly; publish highlights and lessons learned. MIT Sloan and HBR literature underline that visibility—paired with fast decisions—sustains momentum and increases program credibility.
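The funnel-conversion metrics above reduce to a few ratios. A minimal sketch, assuming four stage counts as inputs (the example numbers in the test are hypothetical):

```python
def funnel_metrics(submitted: int, approved_poc: int,
                   rolled_out: int, scaled: int) -> dict[str, float]:
    """Conversion rates at each stage of the selection funnel
    (0.0 when a stage has no entrants)."""
    def rate(num: int, den: int) -> float:
        return round(num / den, 3) if den else 0.0
    return {
        "approval_rate": rate(approved_poc, submitted),
        "poc_to_rollout": rate(rolled_out, approved_poc),
        "rollout_to_scale": rate(scaled, rolled_out),
        "end_to_end": rate(scaled, submitted),
    }
```

Reviewing these ratios monthly shows where the funnel leaks: a low approval rate suggests unclear intake guidance, while a low PoC-to-rollout rate points at weak success criteria or blocked data access.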
What about communications and recognition?
Celebrate learning, not just wins. Share two short stories per quarter: one successful rollout and one “smart kill” that saved time and money. Recognize site leaders who hosted PoCs, not only idea submitters. Offer a light incentive (badge, small bonus, or career credit) for mentors and evaluators who consistently improve cycle time and decision quality. This broadens ownership of innovation beyond a single department and embeds it as an organizational habit rather than a campaign.
FAQ
Isn’t narrowing scope limiting creativity? No. It channels creativity toward the problems that matter most, raising the odds of adoption and impact.
How many radical projects should we carry? Set a small, explicit slice (often ~10%) so bold bets exist without crowding out near-term gains.
What if we get too many similar ideas? De-duplicate early, merge teams where possible, and select the variant with the strongest evidence and fastest path to a PoC.
Key Takeaways
Publish an explicit innovation thesis so idea intake is guided—not random.
Run a two-stage filter: strategic fit first, portfolio mix second (incremental/adjacent/radical).
Score short-listed ideas with a simple rubric focused on evidence and time-to-signal.
Decide fast with an empowered committee; let evidence trigger budget escalation.
Fund in stages, mentor teams, and provide paved paths for PoCs to reach real settings.
Measure learning, conversion, and value—then communicate wins and “smart kills.”
References
Harvard Business Review — Idea selection, portfolio balance, and decision governance.
MIT Sloan Management Review — Experimentation portfolios and stage-gated funding.
OECD — Innovation capability and measurement frameworks.
European Commission — SME innovation policy and program design guidance.
McKinsey — Operating models for digital and innovation programs.