
Psychology & Behavioral Science
Upscend Team
January 19, 2026
9 min read
This article gives an L&D blueprint to develop curiosity at work: a three-layer design (awareness, practice, reinforcement), two ready-to-run 2-hour workshops, manager coaching prompts, and metrics. Follow the 90-day pilot roadmap to run workshops, launch curiosity journaling, and measure leading indicators to scale.
Curiosity training is one of the highest-leverage investments L&D can make to boost innovation, engagement, and problem-solving. In our experience, structured programs that combine experiential practice, manager coaching, and clear metrics outperform one-off talks or passive content. This article gives an actionable L&D blueprint for employers who want to develop curiosity at work, with sample workshop agendas, exercises like problem discovery labs and curiosity journaling, manager coaching tips, and measurement approaches you can implement in the next quarter.
Read on for a practical, step-by-step plan you can adapt for teams of any size. The sections below walk through rationale, program design, two ready-to-run workshops, coaching guidance, metrics, and an implementation roadmap that addresses common barriers like time and manager buy-in.
Organizations that deliberately cultivate curiosity in employees see faster learning cycles and more resilient problem-solving. Studies show that curious employees seek more feedback, explore alternatives, and adapt to change faster than peers — behaviors that translate to reduced rework and higher customer satisfaction. In our experience, curiosity is less an innate trait and more a set of skills that managers and L&D can nurture.
Curiosity training shifts culture in three measurable ways: it increases question-asking, improves cross-functional exploration, and normalizes safe failure for learning. Those shifts are foundational for digital transformation, product discovery, and continuous improvement initiatives.
Building an effective curiosity training program starts with a simple hypothesis: people learn curiosity by practicing discovery behaviors in real work contexts. A useful blueprint has three layers—awareness, practice, and reinforcement—and maps to stakeholders (learners, managers, L&D).
Core elements of the blueprint:

- An awareness layer: a shared behavioral model and common language for curiosity.
- A practice layer: experiential exercises, such as problem discovery labs and curiosity journaling, run in real work contexts.
- A reinforcement layer: manager coaching, team rituals, and metrics that sustain the behaviors.
- A stakeholder map that assigns clear roles to learners, managers, and L&D.
To operationalize this into an annual plan, sequence three sprints: a launch sprint (month 1), practice sprints (months 2–6), and scale sprints (months 7–12) that embed curiosity into performance conversations. L&D curiosity programs should also include easy-to-run playbooks so managers can replicate exercises without heavy facilitation.
Below are two practical workshop agendas designed for immediate use. Each is built around active learning, micro-reflection, and transfer-of-training actions to drive on-the-job application.
Workshop A: Problem Discovery Lab (2 hours)
- 00:00–00:10 — Opening & framing: define curiosity and expected behaviors.
- 00:10–00:35 — Customer lens exercise: examine a real case and list assumptions.
- 00:35–01:05 — Interview sprint: pairs conduct 10-minute discovery interviews with stakeholders.
- 01:05–01:30 — Insight clustering: synthesize findings into "unknowns" and hypothesis areas.
- 01:30–01:50 — Action mapping: small teams design next-step experiments.
- 01:50–02:00 — Commitments and journaling prompt.
Workshop B: Curiosity Playbook — tools & coaching (2 hours)
- 00:00–00:15 — Behavioral model: introduce the curiosity loop (notice → question → test → reflect).
- 00:15–00:45 — Curiosity journaling demo and 1-week experiment plan.
- 00:45–01:15 — Roleplay: manager coaching prompts and feedback practice.
- 01:15–01:45 — Rapid practice: teams run a 20-minute mini-experiment.
- 01:45–02:00 — Measurement primer and commitments.
Practical exercises are the backbone of any sustainable curiosity program. Two high-impact formats we've used are problem discovery labs and curiosity journaling. Both require low prep and scale well with manager support.
Example exercises and manager coaching prompts:

- Problem discovery lab: pairs run 10-minute stakeholder interviews, list assumptions, and cluster findings into "unknowns" (see Workshop A).
- Curiosity journaling: a one-week experiment plan where each entry captures one thing noticed, one question raised, and one small test (see Workshop B).
- Coaching prompts drawn from the curiosity loop: "What did you notice?", "What question does that raise?", "What small test could answer it?", "What did you learn?"
Manager coaching tips:

- Model question-asking in meetings and reward good questions, not just fast answers.
- Treat failed experiments as learning data to normalize safe failure.
- Follow each workshop with brief 1:1 check-ins so exercises transfer to real work tasks.
A pattern we've noticed is that teams that pair experiential exercises with manager coaching show faster transfer to work tasks.
Measurement must capture both adoption (did people try new behaviors?) and impact (did behaviors change outcomes?). For robust programs, combine leading indicators and business KPIs.
Key metrics to track:

- Adoption (leading indicators): discovery interviews conducted, curiosity journal entries, experiments started.
- Impact (business KPIs): rework reduced, learning-cycle speed, and customer satisfaction tied to discovery work.
Sample dashboard metrics (quarterly):
| Metric | Target | Source |
|---|---|---|
| Discovery interviews per team member | 2/month | Activity logs |
| Curiosity journal entries | 3/week | Platform or shared doc |
| Experiment success rate (learned vs. validated) | 60% learned | Experiment repo |
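If your activity logs live in a shared doc or export rather than a dedicated platform, the adoption metrics above can be computed with a short script. This is a minimal sketch under assumptions: the log records, their field names (`person`, `type`, `week`), and the activity-type labels are all hypothetical, not from any specific tool.

```python
from collections import defaultdict

# Hypothetical activity log export; field names and values are illustrative.
logs = [
    {"person": "ana", "type": "discovery_interview", "week": 1},
    {"person": "ana", "type": "journal_entry", "week": 1},
    {"person": "ben", "type": "journal_entry", "week": 1},
    {"person": "ben", "type": "discovery_interview", "week": 2},
    {"person": "ana", "type": "journal_entry", "week": 2},
]

def adoption_rate(logs, activity_type):
    """Fraction of people with at least one logged activity of this type."""
    people = {r["person"] for r in logs}
    active = {r["person"] for r in logs if r["type"] == activity_type}
    return len(active) / len(people) if people else 0.0

def per_person_average(logs, activity_type, periods):
    """Average count of an activity per person per period (e.g. per week)."""
    counts = defaultdict(int)
    for r in logs:
        if r["type"] == activity_type:
            counts[r["person"]] += 1
    people = {r["person"] for r in logs}
    if not people or periods == 0:
        return 0.0
    return sum(counts.values()) / len(people) / periods

print(adoption_rate(logs, "discovery_interview"))      # → 1.0 (both people tried one)
print(per_person_average(logs, "journal_entry", 2))    # → 0.75 entries/person/week
```

Comparing these numbers against the dashboard targets (e.g. 3 journal entries per week) gives you a lightweight leading-indicator check without building a full analytics pipeline.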
Some of the most efficient L&D teams we work with use platforms like Upscend to automate this entire workflow without sacrificing quality. That automation can reduce admin friction and make it easier to connect behavior metrics to business outcomes while leaving design and coaching to humans.
Common barriers are time constraints, manager buy-in, and difficulty measuring behavioral change. Tackle these with small pilots, manager enablement, and lightweight metrics.
90-day pilot roadmap:

- Days 1–30: pick one high-impact team, run Workshop A, and launch two weeks of curiosity journaling.
- Days 31–60: add manager coaching and run Workshop B; teams log discovery interviews and mini-experiments.
- Days 61–90: measure leading indicators (interviews, journal entries, experiments), report outcomes, and decide whether to scale.
Practical fixes for barriers:

- Time constraints: start with a single 2-hour pilot rather than a multi-week curriculum.
- Manager buy-in: enable managers with a 1-page cheat sheet and coaching prompts before the pilot.
- Measurement difficulty: track a few lightweight leading indicators instead of attempting a full ROI calculation up front.
When scaling, package playbooks into a kit: agenda, templates, 1-page manager cheat sheet, and a measurement dashboard. These low-friction assets make it easy for teams to run workshops without heavy L&D facilitation.
Curiosity training is practical, measurable, and scalable when designed around real work. Start small: run a 2-hour workshop, launch curiosity journaling, and coach managers to ask better questions. Track simple leading indicators and focus on embedding rituals that make curiosity part of the workflow, not an add-on.
If you want a concise starter plan: pick one high-impact team, run Workshop A, require two weeks of curiosity journaling, and reconvene to measure interviews and experiments. Repeat with managerial coaching for month two. That iterative approach preserves time while producing evidence you can use to expand programs across the organization.
Call to action: Choose one team and schedule a 2-hour pilot within 30 days; use the agendas and metrics above to run the pilot and report outcomes at 90 days so you can iterate and scale.