LINCHPIN
SEED
IMPLEMENTATION TOOLKIT


The practitioner's guide to implementing SEED's three-domain alignment model. Designed for community leaders, school administrators, and workforce development professionals.

Author: Damon Gardenhire
Published: March 2026
Series: SEED Alliance Working Papers
Version: 1.0
01

How to Use This Toolkit

This toolkit is a field manual. It assumes you already understand why your community needs aligned systems across education, workforce, and civic life. It assumes you are past the white paper stage. You are ready to build.

The SEED model—Society, Education, Economic Development—is a systems lab for family-rooted generational flourishing. It does not run permanent programs. It generates institutions: organizations designed to outlast the initial investment, serve families across generations, and operate independently within three to five years of launch. This toolkit walks you through that process from first assessment to final spin-off.

Who This Toolkit Is For

Superintendents and CTE directors mapping career-connected pathways. Chamber presidents building employer pipelines. Workforce board chairs measuring real outcomes. Community foundation leaders funding systems change instead of scattered programs. City managers seeking integrated strategy across education, employment, and family stability.

How This Document Is Organized

Sections 3 through 9 follow the SEED implementation lifecycle in sequence. Each section contains background context, step-by-step instructions, and practical tools—checklists, templates, and assessment grids you can photocopy and use immediately. Section 10 presents two case studies from SEED's own portfolio. Section 11 compiles all tools and templates into a single reference section.

A Note on Sequence

The phases are presented linearly, but reality is iterative. You will return to community assessment after convening reveals new data. You will revisit idea-shaping after a pilot surfaces unexpected constraints. The sequence is a scaffold, not a railroad track. Use it to orient yourself, not to limit your responsiveness.

Every section answers one question: What do I actually do? Where research informs the guidance, citations are provided as superscripts. But this is not an academic paper. It is a builder's guide.

02

The SEED Model at a Glance

SEED operates at the intersection of three domains, using three functions, across a defined lifecycle. Every initiative SEED designs must touch all three domains. Single-domain interventions—a workforce program that ignores family stability, an education reform that ignores employer demand—are precisely the fragmented approaches SEED exists to replace.1

Three Domains

Society

Mediating institutions, family stability, civic trust, social cohesion. The connective tissue that makes communities function.

Key question: Are families and institutions strong enough to sustain gains?

Education

Formation and credentialing, K-12 through postsecondary. Not just degree attainment but character development, skill mastery, and purposeful preparation.

Key question: Are young people being formed, not just processed?

Economic Development

Workforce pipelines, employer partnerships, wage growth, apprenticeship infrastructure. Prosperity tied to real employers with real demand.

Key question: Are jobs connected to training connected to people?

Three Functions

I. Convene. Build the table. Bring together the leaders who control resources, policy, and implementation across all three domains.
II. Shape Ideas. Move from conversation to design. Study models, stress-test with practitioners, draft implementable plans.
III. Build Institutions. Pilot, measure, refine, and spin off independent organizations designed to outlast SEED's involvement.

The Lifecycle

Every SEED initiative follows the same arc: identify a need through community assessment, design a response through research and stakeholder engagement, pilot the design with a defined population and timeline, measure the impact against pre-established baselines, and spin off the resulting institution with independent governance, diversified funding, and its own legal identity. SEED generates institutions, not permanent programs. The goal is always self-sustaining, community-owned capacity.

The SEED Litmus Test

Before investing in any initiative, SEED applies four quality gates. Every proposed intervention must pass all four to proceed.

  1. Replicable. Can this model be adapted to other communities without heroic local effort?
  2. Measurable. Can we define success in quantitative terms before we begin?
  3. Spinnable. Can this become an independent institution within three to five years?
  4. Family-centered. Does this serve families across generations, not just individuals in isolation?

03

Phase 1: Community Assessment

Before convening a single meeting, you need a clear-eyed picture of your community's three-domain health. This is not an academic exercise. It is reconnaissance. You are looking for the specific gaps where misalignment between education, workforce, and civic life is producing measurable harm—and the specific assets you can build on.

Expect this phase to take four to six weeks. It requires one dedicated person (or a small team) with access to public data, relationships with local institutions, and the authority to ask uncomfortable questions.

Society Domain Assessment

The society domain is the hardest to quantify and the easiest to neglect. But without strong mediating institutions—churches, civic organizations, mutual-aid networks, functioning families—no workforce program or education reform will hold.2 Gains made in school or the labor market dissipate when the social fabric cannot sustain them.

  • Identify the five largest faith congregations in your target geography. Are their leaders engaged in civic life beyond their walls?
  • Count the active civic organizations (Rotary, Lions, Kiwanis, neighborhood associations) and their combined membership. Is the trend growing or declining?
  • Pull single-parent household rates from Census ACS for your county and compare to state and national averages.
  • Assess voter participation rates in the last three municipal elections. Below 15% signals severe civic disengagement.
  • Identify whether a community foundation, United Way, or equivalent coordinating philanthropy exists. If yes, determine their total annual grantmaking budget.
  • Survey the availability of family-serving infrastructure: licensed childcare slots per capita, domestic violence shelter capacity, family counseling providers accepting Medicaid.

Education Domain Assessment

The education domain assessment focuses on two questions: Are students being prepared for productive adult life? And is the education system connected to actual employer demand? You are not auditing curriculum. You are mapping the pipeline from kindergarten to career.

  • Pull third-grade reading proficiency rates for every elementary school in your target geography. Below 30% proficient is a crisis indicator.3
  • Map every Career and Technical Education (CTE) pathway offered in your district(s). For each pathway, identify whether a local employer has formally committed to hiring completers.
  • Determine the high school graduation rate, disaggregated by race, income, and special education status.
  • Identify postsecondary credential attainment rates for adults 25-64 in your county (Census ACS, Table S1501).
  • Count the number of active apprenticeship programs registered with your state's Department of Labor.
  • Determine whether any career-connected learning requirement exists in state or district policy (e.g., work-based learning hours, industry certification incentives).
  • Identify the five largest employers within a 30-minute commute of your target schools. For each, determine whether they have any formal relationship with K-12 or postsecondary institutions.

Economic Development Domain Assessment

  • Pull unemployment rates (BLS, LAUS) for your county, disaggregated by age and race where available.
  • Identify the top ten employers by headcount in your region using state workforce data or chamber of commerce records.
  • Determine the median household income and compare to your state's cost-of-living-adjusted median (MIT Living Wage Calculator provides local benchmarks).
  • Map open positions by industry sector using real-time labor market data (Lightcast/EMSI, state job bank, Indeed postings).
  • Identify the sectors with the highest projected growth over the next ten years (BLS Occupational Outlook, state workforce projections).
  • Assess the presence or absence of a sector partnership or industry council that convenes employers around shared workforce needs.
  • Determine whether your region has a workforce development board with an active strategic plan. If yes, when was it last updated?

Data Sources Reference

Data Need | Source | Update Frequency
Family structure, income, education attainment | Census ACS 5-Year Estimates | Annual (December)
Child well-being indicators | KIDS COUNT Data Center (Annie E. Casey) | Annual
Employment, unemployment, wages | BLS Local Area Unemployment Statistics | Monthly
K-12 achievement and graduation rates | State Department of Education report cards | Annual (fall)
CTE enrollment and completion | Perkins V state data / OCTAE | Annual
Real-time labor demand | Lightcast (formerly EMSI/Burning Glass) | Continuous
Occupational projections | BLS Occupational Outlook Handbook | Biennial
Civic participation | County election board records | Per election cycle
Living wage benchmarks | MIT Living Wage Calculator | Annual

04

Phase 2: Convening

Convening is the most underestimated function in community change. Most regions have the right people—they just never sit in the same room at the same time with a shared agenda and shared data. SEED's convening function is not a networking event. It is a structured process for building a working table where decisions get made and commitments get honored.

Who Must Be in the Room

The table must include operational leaders—people who control budgets, hire staff, and set policy—from all three domains. Advisory voices are welcome later. The founding table requires decision-makers.

Sector | Required Roles | Why They Matter
K-12 Education | Superintendent or designee, CTE Director | Controls curriculum, scheduling, and student placement. Without K-12, no pipeline exists.
Higher Education | Community college president or VP of workforce, tech center director | Controls credential programs, dual enrollment, and industry certification pathways.
Business | Chamber president, 2-3 anchor employers (HR leads), small business coalition representative | Defines actual demand. Without employers, training has no destination.
Faith / Civic | 2-3 pastoral leaders, interfaith council chair (if one exists) | Commands trust in communities that institutions have lost. Reaches families government cannot.
Government | Mayor or city manager, workforce board chair or director | Controls public investment, zoning, incentive structures, and regulatory alignment.
Philanthropy | Community foundation program officer, United Way director | Provides flexible capital. Can fund the gap between public dollars and real costs.

The "No Spectators" Rule

Every person at the table must have operational authority within their institution and a willingness to commit resources—staff time, budget, or policy change. Observers, designees-of-designees, and attendees sent "to listen and report back" erode the table's credibility. If someone cannot make decisions, they should not be seated until they can. This is a working body, not a listening session.

Meeting Cadence and Structure

The convening table meets monthly for the first six months, then transitions to quarterly once working groups are established. Each meeting follows a disciplined agenda structure designed to prevent the two most common failures of community coalitions: drift into abstract conversation and retreat into sector-specific silos.4

  1. Data review (15 minutes). One page of updated community indicators, drawn from your Phase 1 assessment. No narration—the data speaks. Table members read silently, then react.
  2. Cross-domain spotlight (20 minutes). One member presents a challenge or opportunity from their sector that requires action from another sector at the table. The format forces interdependence.
  3. Working group reports (20 minutes). Each active working group delivers a 5-minute status report using a standardized template: what we committed to, what we accomplished, what we need.
  4. Decision block (15 minutes). Any decision requiring table-wide authorization is presented, debated, and voted. Decisions are recorded. Abstentions are noted. No decision is deferred more than once.
  5. Commitments (5 minutes). Each member states one concrete action they will take before the next meeting. Commitments are read aloud, recorded, and reviewed at the opening of the next session.

First Meeting Agenda Template

Inaugural Convening — Suggested Agenda (2 hours)

Welcome & purpose (15 min) — SEED lead frames the why. Not the org chart, not the grant. The problem.
Community data briefing (20 min) — Present the Phase 1 assessment findings. Three domains, one page each. Let the data do the talking.
Table introductions (20 min) — Each member: name, role, one sentence on what they can commit. Not bios. Commitments.
Gap identification (30 min) — Facilitated discussion: Where is the alignment broken? Where do education, workforce, and civic systems fail to connect?
Working group formation (20 min) — Identify 2-3 priority gaps. Assign cross-sector teams of 3-5 members to each. Set 60-day deliverables.
Cadence & commitments (15 min) — Agree on meeting schedule. Each member states one action before next meeting.

05

Phase 3: Idea-Shaping

Most community coalitions stall between "we all agree there's a problem" and "here's what we're going to build." The SEED idea-shaping process is designed to close that gap. It is a structured research and design sprint that moves a convening table from shared concern to implementable plan in 90 days.

The process has five stages. Each stage has a defined deliverable. No stage is skipped.

  1. Identify the Gap. Name the specific misalignment. Not "education needs improvement." Something measurable.
  2. Study Models. Find 3-5 examples of other communities that have addressed this gap. Include at least one international model.
  3. Stress-Test. Present findings to local practitioners. Ask: What won't work here? What are you missing?
  4. Draft Design. Write a one-page design brief specifying population, intervention, timeline, and expected outcomes.
  5. Stakeholder Review. Return to the full convening table for formal review, revision, and authorization to pilot.

Stage 1: Identify the Gap

The gap must be stated in specific, measurable language that connects at least two of the three SEED domains. "Our students aren't career-ready" is a complaint. "Only 12% of our high school CTE completers are placed into jobs in their field of study within six months of graduation, while local manufacturing employers report 340 unfilled positions requiring the same credentials" is a gap statement. The difference matters because the gap statement contains the seed of its own solution.

Gap Statement Template

The problem: [Specific measurable condition]
Affected population: [Who, how many, where]
Domains involved: [Society / Education / Economic Development]
Current cost of inaction: [Economic, social, or human cost of the status quo]
What alignment would look like: [Describe the connected state you are trying to create]

Stage 2: Study Models

Research is not optional and it is not delegated to interns. The working group responsible for this gap spends two to three weeks identifying communities, states, or countries that have made measurable progress on a similar challenge. SEED maintains a curated library of model programs, but every implementation context is unique. You are looking for structural patterns, not copy-paste solutions.5

For each model studied, document: the intervention design, the population served, the funding structure, the measured outcomes, the timeline from launch to measurable results, and—critically—what did not work and why. Failures are more instructive than successes.

Stage 3: Stress-Test with Practitioners

Present your model research to the people who would actually implement the proposed design. This means teachers, not just superintendents. Foremen, not just HR directors. Caseworkers, not just agency heads. The practitioner stress-test is designed to surface the operational constraints that leaders often cannot see: scheduling conflicts, union rules, licensing requirements, funding restrictions, seasonal patterns, transportation barriers, and the hundred other granular realities that determine whether a good idea survives contact with the field.

Stress-Test Protocol

Present the proposed model in a 10-minute briefing. Then ask three questions and stop talking: (1) What would prevent this from working in our community? (2) What are we not seeing? (3) If you had to make this work with your current resources, what would you change? Record every response. Do not defend the design during this session.

Stage 4: Draft the Design Brief

The design brief is a one-page document—front and back, maximum—that specifies every essential parameter of the proposed pilot. It is not a grant proposal. It is an engineering document. Anyone reading it should know exactly what will be built, for whom, over what timeline, and how success will be measured.

Stage 5: Stakeholder Review

The full convening table reviews the design brief, asks questions, proposes modifications, and ultimately votes to authorize the pilot. Authorization means commitment: member institutions agree to contribute the resources specified in the design brief. This is the moment where coalition becomes partnership. If the table cannot authorize, the design returns to Stage 3 for revision. No pilot proceeds without explicit authorization and resource commitment from the convening table.

Quality Gates for Idea-Shaping

Before any idea advances to pilot, it must satisfy all four of the SEED quality gates: Is it replicable beyond this community? Is it measurable with available data infrastructure? Can it spin off as an independent institution? Does it serve families, not just individuals? If any gate fails, the design is revised until all four pass.

06

Phase 4: Pilot Design

A pilot is not a demonstration project. It is a controlled test of whether a designed intervention produces measurable outcomes in a defined population over a sufficient timeline. Every SEED pilot follows the same structural parameters, regardless of the specific intervention being tested. These parameters are non-negotiable because rigorous evaluation is impossible without them.

Structural Parameters

Parameter | SEED Standard | Rationale
Population size | 50–200 participants | Below 50, statistical significance is unreliable. Above 200 in a pilot, operational complexity outpaces learning.
Geographic boundary | Single school district, single county, or defined neighborhood | Multi-jurisdiction pilots introduce governance complexity that obscures program effects.
Timeline | Minimum 3 years | Year 1 is launch. Year 2 produces first meaningful data. Year 3 validates trends. Anything shorter is anecdote.6
Staffing | Dedicated project director (0.5 FTE minimum) + evaluation lead | Pilots managed "on top of" existing duties fail. Protected staff time is the single highest predictor of pilot success.
Control methodology | Comparison group or matched cohort | Without a comparison population, you cannot isolate program effects from secular trends.

Budget Framework

Every SEED pilot budget includes five mandatory line items. The specific dollar amounts vary by intervention type, but the categories are fixed. Budgets that omit any of these categories are returned for revision.

  1. Personnel. Project director, evaluation lead, and any direct-service staff. Typically 55–65% of total budget.
  2. Evaluation. Independent evaluator, data collection instruments, IRB fees, analysis. Minimum 10% of total budget—non-negotiable.7
  3. Program delivery. Materials, technology, facility costs, participant support (transportation, childcare, stipends). Typically 15–25% of total budget.
  4. Convening. Working group meetings, stakeholder communication, community engagement. Typically 5% of total budget.
  5. Sustainability planning. Spin-off preparation, governance development, fundraising infrastructure. Minimum 5% of total budget beginning in Year 2.
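The five mandatory categories can be expressed as a simple validation check. This is a minimal sketch only: the category keys, the exact boundaries used for the "typical" ranges, and the dollar figures in the example are illustrative assumptions, not SEED-published values.

```python
# Illustrative sketch: check a pilot budget against the five mandatory
# SEED categories. Range boundaries for "typical" categories are assumed;
# only the 10% evaluation and 5% sustainability minimums come from the text.

def validate_budget(budget: dict[str, float]) -> list[str]:
    """Return warnings for missing categories or shares outside guidance."""
    total = sum(budget.values())
    shares = {k: v / total for k, v in budget.items()}
    rules = {
        "personnel": (0.55, 0.65),         # typically 55-65%
        "evaluation": (0.10, 1.00),        # minimum 10%, non-negotiable
        "program_delivery": (0.15, 0.25),  # typically 15-25%
        "convening": (0.03, 0.10),         # roughly 5% (band assumed)
        "sustainability": (0.05, 1.00),    # minimum 5% beginning Year 2
    }
    warnings = []
    for category, (low, high) in rules.items():
        if category not in budget:
            warnings.append(f"missing mandatory category: {category}")
        elif not (low <= shares[category] <= high):
            warnings.append(
                f"{category} share {shares[category]:.0%} "
                f"outside {low:.0%}-{high:.0%}"
            )
    return warnings

# Invented example figures for a $500,000 pilot budget.
example = {
    "personnel": 300_000,        # 60%
    "evaluation": 50_000,        # 10%
    "program_delivery": 100_000, # 20%
    "convening": 25_000,         # 5%
    "sustainability": 25_000,    # 5%
}
print(validate_budget(example))  # an empty list means all categories pass
```

A budget that returns any warnings would, under the rule above, be returned for revision before approval.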

Evaluation Partnership

Every SEED pilot engages an independent evaluation partner—a university research center, policy institute, or qualified evaluation firm—before the pilot launches. The evaluator participates in design, establishes the measurement framework, collects baseline data, and conducts interim and final analyses. The evaluator is not a consultant hired after the fact to validate results. They are a design partner who ensures the pilot produces evidence, not just stories.

If your pilot involves human subjects research (which most SEED pilots do, given that they collect individual-level data on participants), you will need Institutional Review Board (IRB) approval. University evaluation partners typically manage this process through their own IRB. Budget four to eight weeks for IRB review before participant enrollment begins.

Pilot Design Canvas

Intervention name:
Gap addressed: [from Phase 3 gap statement]
Target population: [who, how many, selection criteria]
Geography: [district, county, or neighborhood]
Duration: [start date through end date, minimum 3 years]
Total budget:
Funding sources: [confirmed and prospective]
Project director: [name, FTE allocation]
Evaluation partner: [institution and lead researcher]
Comparison method: [matched cohort, waitlist control, historical comparison]
Primary outcomes (3 max):
Domains served: [Society / Education / Economic Development]
Spin-off target date: [Year 3, 4, or 5]

07

Phase 5: Measurement

SEED measures across all three domains simultaneously. A workforce program that increases employment but destabilizes families has not succeeded. An education reform that raises test scores but disconnects students from their community has not succeeded. Three-domain measurement is harder than single-domain measurement. It is also the only kind that tells you whether you are actually building generational flourishing or just moving metrics.

Society Metrics

Metric | Data Source | Collection Cadence
Civic participation rate | Voter registration and turnout data, volunteer hours logged with partner organizations | Annually + post-election
Social trust index | Participant survey (validated instrument: General Social Survey trust battery)8 | Baseline + annually
Family stability indicators | Household composition, housing stability (same address at 12-month intervals), child welfare referrals | Baseline + annually
Institutional engagement | Membership or participation in faith, civic, or community organizations (self-reported) | Baseline + annually
Perceived community safety | Participant survey + crime statistics for target geography | Baseline + annually

Education Metrics

Metric | Data Source | Collection Cadence
K-3 reading proficiency | State assessment data (school-level) | Annually (spring)
CTE pathway enrollment and completion | District CTE office, Perkins V data | Annually
Industry credential attainment | Credentialing bodies (NCCER, AWS, CompTIA, etc.) + district records | Annually
High school graduation rate | State education data, disaggregated | Annually
Postsecondary enrollment (within 12 months) | National Student Clearinghouse | Annually (fall following graduation)
College/career placement rate | State longitudinal data system or follow-up survey | 6 and 12 months post-graduation

Economic Development Metrics

Metric | Data Source | Collection Cadence
Employment rate (participants) | State wage records (UI data), participant self-report | Quarterly (wage records), annually (survey)
Median earnings | State wage records, adjusted for inflation | Quarterly
Employer satisfaction | Employer survey (custom instrument, validated against NAM/BRT frameworks) | Annually
Apprenticeship enrollment and completion | State registered apprenticeship data, employer records | Annually
Job placement in field of training | Participant follow-up survey + employer verification | 6 and 12 months post-completion
Earnings growth trajectory | Longitudinal wage records (Year 1, 2, 3 post-program) | Annually, longitudinal

The Family Flourishing Index

SEED's composite metric—the Family Flourishing Index (FFI)—integrates indicators across all three domains into a single score that tracks generational progress for participating families. The FFI is not a replacement for domain-specific metrics. It is a synthesizing instrument that answers the question: Is this family, taken as a whole, better positioned for generational flourishing than it was at baseline?9

Family Flourishing Index Components

Economic security (25%): Household income relative to living wage, employment stability, savings/debt ratio.

Educational attainment (25%): Highest credential in household, children's grade-level proficiency, postsecondary enrollment history.

Family stability (25%): Housing stability, household composition continuity, child welfare system involvement (inverse).

Civic and social connection (25%): Organizational membership, social trust score, voter participation, neighborhood engagement.

Scoring: Each component scored 0–100 based on national benchmarks. Composite FFI = weighted average. Target: 15-point gain over 3-year pilot.
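The composite calculation above fits in a few lines of code. The component names and equal 25% weights follow the text; the baseline and Year 3 scores in the example are invented for illustration.

```python
# Sketch of the Family Flourishing Index composite, assuming each component
# has already been scored 0-100 against national benchmarks. Sample scores
# are illustrative, not real participant data.

WEIGHTS = {
    "economic_security": 0.25,
    "educational_attainment": 0.25,
    "family_stability": 0.25,
    "civic_social_connection": 0.25,
}

def ffi(scores: dict[str, float]) -> float:
    """Composite FFI: weighted average of the four component scores."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

baseline = {"economic_security": 40, "educational_attainment": 55,
            "family_stability": 60, "civic_social_connection": 45}
year3 = {"economic_security": 58, "educational_attainment": 70,
         "family_stability": 72, "civic_social_connection": 60}

gain = ffi(year3) - ffi(baseline)
print(f"baseline {ffi(baseline):.1f}, year 3 {ffi(year3):.1f}, gain {gain:.1f}")
# prints "baseline 50.0, year 3 65.0, gain 15.0" -- a gain of 15 or more
# meets the 3-year pilot target.
```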

Data Collection Cadence

Baseline data is collected before the intervention begins—never retroactively. Annual data collection occurs at the same point each year (SEED standard: September, to align with school calendars and fiscal years). Quarterly wage data is pulled from state systems with appropriate data-sharing agreements in place before the pilot launches. No data, no pilot.

08

Phase 6: Spin-Off

SEED does not build permanent programs. It builds institutions. The distinction is existential: a program depends on its funder; an institution serves its community. Every SEED pilot is designed from day one to become an independent organization within three to five years of launch. The spin-off phase is not an afterthought appended to a successful pilot. It is engineered into the pilot from the beginning.

The 3-to-5-Year Independence Timeline

Year | Milestone | SEED Role
Year 1 | Pilot launches. Governance advisory board formed. Legal structure explored. SEED provides 80–100% of operational leadership. | Primary operator
Year 2 | Governance board formalized. Independent fundraising begins. Local executive director identified and onboarded. SEED provides 50–70% of operational leadership. | Co-operator / coach
Year 3 | Legal entity formed (501(c)(3) or appropriate structure). Diversified funding secured (no single source > 40%). Local leadership managing daily operations. SEED provides 10–20% advisory support. | Advisor / evaluator
Year 4–5 | Full independence. SEED maintains evaluation partnership only. Brand transfer complete. Annual check-in for network membership. | Network partner

Governance Formation

An independent board of directors must be seated by the end of Year 2. Board composition should reflect the cross-sector nature of the intervention: at minimum one member from education, one from business/workforce, one from the faith or civic community, one with financial management expertise, and one community member from the population served. SEED retains one advisory seat (non-voting) through Year 3.

Funding Diversification

No independent institution survives on a single funding stream. By the time of spin-off, the organization must have secured funding from at least three distinct sources. The "40% rule" applies: no single funder contributes more than 40% of the annual operating budget. This forces diversification and prevents the organization from becoming a dependent subsidiary of any single patron.10

Recommended Funding Mix at Spin-Off

Government contracts or grants (25–35%): Workforce board funding, state education grants, federal program dollars (Perkins, WIOA, TANF).

Philanthropic grants (20–30%): Community foundations, national foundations, corporate giving programs.

Earned revenue (15–25%): Fee-for-service, employer-paid training, tuition (where applicable), social enterprise.

Individual giving (10–20%): Annual fund, major gifts, events. The hardest to build but the most resilient.
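The three-source minimum and the 40% rule can be checked mechanically against any proposed mix. A brief sketch, with funder names and dollar amounts invented for illustration:

```python
# Illustrative check of the spin-off funding rules from the text:
# at least three distinct sources, and no single funder above 40%
# of the annual operating budget.

def diversification_check(funders: dict[str, float]) -> list[str]:
    """Return a list of problems; an empty list means the mix passes."""
    total = sum(funders.values())
    problems = []
    if len(funders) < 3:
        problems.append("fewer than three distinct funding sources")
    for name, amount in funders.items():
        share = amount / total
        if share > 0.40:
            problems.append(f"{name} provides {share:.0%}, above the 40% cap")
    return problems

# Invented $600,000 operating budget spread across the recommended mix.
mix = {
    "workforce_board_contract": 180_000,  # government: 30%
    "community_foundation": 150_000,      # philanthropy: 25%
    "employer_paid_training": 120_000,    # earned revenue: 20%
    "annual_fund": 150_000,               # individual giving: 25%
}
print(diversification_check(mix))  # prints "[]" -- ready for spin-off
```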

Legal Structure

Most SEED spin-offs incorporate as 501(c)(3) public charities. However, the legal structure should match the institution's revenue model and mission. Workforce training entities with significant earned revenue may benefit from a hybrid structure (e.g., 501(c)(3) + LLC subsidiary). Educational institutions may require charter authorization or state licensing. SEED provides legal consultation during Year 2 to determine the optimal structure.

Brand Transfer

SEED-incubated institutions typically launch under SEED branding and transition to their own identity during the spin-off process. Brand transfer includes: developing the organization's own name and visual identity, establishing independent digital presence (website, social media), transitioning public communications, and formalizing the relationship as a "SEED Alliance member" rather than a "SEED program."

Spin-Off Readiness Checklist

  • Independent board of directors seated with fiduciary responsibility
  • Legal entity formed and in good standing with Secretary of State
  • 501(c)(3) determination letter received (or appropriate tax status established)
  • Independent bank accounts and financial controls in place
  • Annual budget approved by board with no single funder exceeding 40%
  • Executive director hired and managing daily operations
  • Three-year strategic plan adopted by board
  • Independent evaluation contract in place (may continue with original evaluator)
  • Brand identity developed and deployed across all public materials
  • Data systems and participant records transferred to independent infrastructure
  • SEED advisory seat formalized (non-voting, annual check-in)
  • Memorandum of understanding signed governing the SEED Alliance membership
09

Case Studies

NOTE ON CASE STUDIES

The following case studies are projected implementation scenarios based on SEED's program design, comparable program outcomes from similar initiatives nationally, and the three-domain alignment model described in this toolkit. They illustrate how the model is designed to function in practice. Tekton Academies and Ready Set Thrive are in development; neither has yet enrolled students or awarded grants.

Case Study 1: Tekton Academies

A CLASSICAL MICROSCHOOL FOR OPPORTUNITY YOUTH

Tekton Academies emerged from a gap identified in SEED's initial community assessment: a population of young adults aged 16 to 24 who were neither enrolled in school nor employed—the cohort federal workforce policy calls "opportunity youth" and local communities call "disconnected." In Oklahoma, this population numbered approximately 80,000 in 2024, representing roughly $2.5 billion in annual lost productivity and future fiscal costs.11

The existing response landscape offered two options, both inadequate. Traditional GED programs provided credentialing without formation—a diploma mill that restored no purpose, built no character, and connected to no employer. Job Corps and similar residential programs provided intensive services but at costs exceeding $30,000 per participant annually, with completion rates below 40%.12

Tekton was designed to fill the space between these two failures. The model combines classical education principles—the cultivation of virtue, mastery of foundational knowledge, formation of character—with skilled trades training connected to actual employer demand. The name itself is deliberate: tekton, the Greek word for craftsman, the same word used to describe the vocation of Joseph and Jesus in the gospels. It signals that craft is dignified work, not a consolation prize for those who failed at academics.

Tekton Design Parameters

Population: Opportunity youth ages 16–24, neither enrolled nor employed. Priority given to those with family responsibilities.

Structure: Microschool cohorts of 12–15 students. Two-year program. Half-day classical academics, half-day skilled trades (construction, welding, HVAC, electrical).

Formation: The 10 Virtues curriculum—a character development framework drawn from classical and theological sources, adapted for secular delivery. Weekly seminar, daily practice.

Employer connection: Apprenticeship placement with partner contractors beginning in semester three. Employer advisory board reviews curriculum quarterly.

Three-domain integration: Education (classical curriculum + credentials), Economic Development (trades training + employer placement), Society (virtue formation + family support services).

What comparable programs suggest. The Tekton model is designed to test three hypotheses. First, that opportunity youth respond to high expectations: evidence from classical microschool models nationally indicates that rigorous curricula (reading primary texts, engaging in Socratic dialogue, writing argumentative essays) produce engagement rates higher than vocational training alone. Second, that employer partnerships work best when employers help design the curriculum, not just receive its graduates, a pattern documented in career-connected learning programs like PEAK Innovation Center and Blue Valley CAPS. Third, that a two-year timeline is sufficient for participants to reach full economic independence.

Comparable workforce programs suggest that family instability (housing disruptions, childcare gaps, and legal issues) causes more program interruptions than academic difficulty. Tekton's design therefore includes a family navigator from the outset: a dedicated staff member who connects student families with existing social services. Evidence from navigator programs serving similar populations suggests this addition can reduce disruptions by approximately 30–40%.

Case Study 2: Ready Set Thrive

CAREER-CONNECTED HIGH SCHOOLS VIA SEED GRANTS

Ready Set Thrive addresses a different gap: the disconnect between what high schools teach and what employers need, at the system level rather than the individual level. Where Tekton serves opportunity youth who have already fallen out of the system, Ready Set Thrive intervenes within the system itself—redesigning how high schools connect academic learning to career pathways.

The mechanism is a competitive seed grant program. High schools apply for grants of up to $225,000 over three years to design and implement career-connected innovation centers—physical spaces within the school building where students work on projects defined by local employer partners, earn industry credentials alongside academic credit, and build portfolios of applied work that serve as both college applications and job applications.

Ready Set Thrive Selection Criteria

Superintendent commitment: Superintendent must co-sign the application and commit to sustaining the innovation center beyond the grant period.

Employer partnership: A minimum of three local employers must commit to defining project briefs, mentoring students, and interviewing completers for employment or apprenticeship.

CTE integration: The innovation center must be integrated into the district's CTE pathway structure, not operated as a standalone club or extracurricular.

Measurement plan: Applicants must specify how they will track student enrollment by demographics, credential attainment, and post-graduation placement.

Priority: Applications from rural districts, districts with graduation rates below 85%, and districts serving high proportions of economically disadvantaged students receive weighted scoring.
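
The priority weighting above can be sketched in code. This is a minimal illustration, not SEED's actual scoring rubric: the multiplier values, the 0–100 base score, and the flag names are all assumptions chosen for the example.

```python
# Hypothetical weighted-scoring sketch for Ready Set Thrive applications.
# Weight values and flag names are illustrative assumptions, not SEED policy.

PRIORITY_WEIGHTS = {
    "rural": 1.10,           # rural district
    "low_graduation": 1.10,  # graduation rate below 85%
    "high_need": 1.10,       # high proportion economically disadvantaged
}

def weighted_score(base_score: float, priorities: set[str]) -> float:
    """Apply priority multipliers to a reviewer's base score (0-100), capped at 100."""
    score = base_score
    for flag in priorities:
        score *= PRIORITY_WEIGHTS.get(flag, 1.0)
    return round(min(score, 100.0), 1)

# A rural district with a sub-85% graduation rate:
print(weighted_score(80.0, {"rural", "low_graduation"}))  # 96.8
```

Multiplicative weights keep the ordering of strong applications intact while still moving priority districts up the list; the cap prevents weighting alone from outscoring a stronger application.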

Projected indicators. Based on outcomes from comparable career-connected learning programs (PEAK Innovation Center, Blue Valley CAPS, NAF academies), SEED projects that funded schools will see measurable increases in CTE pathway enrollment (comparable programs report 15–25% increases in Year 1), higher student attendance on innovation center days than on traditional schedule days, and employer engagement exceeding the minimum three-employer commitment. The key success metric is whether participating superintendents include innovation center operating costs in the district's base budget after the grant expires, signaling the kind of institutional adoption that makes spin-off unnecessary: the school itself absorbs the model.

10

Tools & Templates

This section compiles the assessment instruments, planning templates, and evaluation tools referenced throughout this toolkit. Each tool is designed to be used as-is or adapted to your local context. They are deliberately simple—one page each, minimal jargon—because the people who will use them are busy and already skeptical of consultants bearing binders.

Tool 1: Community Assessment Worksheet

Three-Domain Community Assessment
Community name  
Assessment date  
Lead assessor  
Domain Indicator Local Value State Avg. Status
Society Single-parent household rate
Society Municipal voter turnout
Society Active civic organizations (count)
Education 3rd-grade reading proficiency
Education High school graduation rate
Education CTE pathway completion rate
Economic Dev. Unemployment rate
Economic Dev. Median household income
Economic Dev. Active apprenticeship programs (count)
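
Filling the Status column comes down to comparing each local value against the state average. The sketch below shows one way to do that; the 5% parity band and the "strength/parity/gap" labels are assumptions for illustration, and indicators where lower is better (e.g., unemployment rate) are handled with a flag.

```python
# Illustrative Status helper for the Three-Domain Community Assessment.
# The 5% parity band and the status labels are assumed conventions.

def indicator_status(local: float, state_avg: float,
                     higher_is_better: bool = True,
                     band: float = 0.05) -> str:
    """Classify a local value against the state average with a parity band."""
    if state_avg == 0:
        return "parity"
    diff = (local - state_avg) / state_avg
    if not higher_is_better:
        diff = -diff  # invert so a lower local value reads as a strength
    if diff > band:
        return "strength"
    if diff < -band:
        return "gap"
    return "parity"

# e.g., 3rd-grade reading proficiency: 62% locally vs. 70% state average
print(indicator_status(62, 70))  # "gap"
# unemployment rate, where lower is better: 3.2% vs. 4.0%
print(indicator_status(3.2, 4.0, higher_is_better=False))  # "strength"
```

The parity band keeps small, likely-noisy differences from being read as real strengths or gaps; widen it for volatile indicators like unemployment.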

Tool 2: Convening Agenda Template

Monthly Convening Table — Standard Agenda
Date  
Location  
Facilitator  
Members present  
1. Data review (15 min) One-page indicator update. Silent read, then reactions.
2. Cross-domain spotlight (20 min) Presenter: _____ Topic: _____
3. Working group reports (20 min) Groups reporting: _____
4. Decision block (15 min) Decisions on table: _____
5. Commitments (5 min) Each member states one action before next meeting.

Tool 3: Measurement Dashboard Template

Metric Baseline Year 1 Year 2 Year 3 Target
Employment rate
Median earnings
Credential attainment
Family Flourishing Index
CTE placement rate
Civic participation score
Employer satisfaction score
Housing stability rate

Tool 4: Spin-Off Readiness Scorecard

Readiness Criterion Not Started In Progress Complete
Board of directors seated
Legal entity formed
Tax-exempt status received
Executive director hired
Diversified funding (40% rule met)
Three-year strategic plan adopted
Independent financial controls
Evaluation contract in place
Brand identity deployed
Data systems transferred
SEED MOU signed
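
The scorecard's pass rule can be made explicit in a few lines. The criterion names below come from Tool 4; the all-or-nothing rule (every item marked Complete before spin-off) is an assumed reading of the scorecard, not a stated SEED policy.

```python
# Sketch of a spin-off readiness check based on Tool 4.
# The rule that every criterion must be "Complete" is an assumption.

CRITERIA = [
    "Board of directors seated",
    "Legal entity formed",
    "Tax-exempt status received",
    "Executive director hired",
    "Diversified funding (40% rule met)",
    "Three-year strategic plan adopted",
    "Independent financial controls",
    "Evaluation contract in place",
    "Brand identity deployed",
    "Data systems transferred",
    "SEED MOU signed",
]

def ready_to_spin_off(status: dict[str, str]) -> bool:
    """True only when every readiness criterion is marked 'Complete'."""
    return all(status.get(c) == "Complete" for c in CRITERIA)

def remaining_items(status: dict[str, str]) -> list[str]:
    """List the criteria still marked Not Started or In Progress."""
    return [c for c in CRITERIA if status.get(c) != "Complete"]
```

A criterion that is missing from the status record counts the same as Not Started, which errs on the side of delaying spin-off rather than launching an unready institution.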

Tool 5: Pilot Design Canvas (Blank)

SEED Pilot Design Canvas
Intervention name  
Gap statement  
Target population  
Geography  
Duration  
Budget  
Funding sources  
Project director  
Evaluation partner  
Comparison method  
Primary outcomes  
Domains served  
Spin-off target  
SEED quality gates Replicable? ___ Measurable? ___ Spinnable? ___ Family-centered? ___
11

References

  1. Kania, J. & Kramer, M. "Collective Impact." Stanford Social Innovation Review, Winter 2011. The foundational argument for cross-sector alignment in community change efforts.
  2. Putnam, R.D. Bowling Alone: The Collapse and Revival of American Community. Simon & Schuster, 2000. Documents the erosion of mediating institutions and its consequences for civic life.
  3. Hernandez, D.J. "Double Jeopardy: How Third-Grade Reading Skills and Poverty Influence High School Graduation." Annie E. Casey Foundation, 2012. Establishes third-grade reading as a leading indicator of educational and economic outcomes.
  4. Butterfoss, F.D. & Kegler, M.C. "The Community Coalition Action Theory." In DiClemente, R.J., et al. (eds.), Emerging Theories in Health Promotion Practice and Research, 2nd ed. Jossey-Bass, 2009. Describes the structural conditions under which multi-sector coalitions produce sustained outcomes.
  5. Schwartz, R.B. & Hoffman, N. "Career Pathways as a Systemic Framework." In Hoffman, N. & Schwartz, R.B. (eds.), Learning for Careers: The Pathways to Prosperity Network. Harvard Education Press, 2017. Documents career-connected education models across the United States and Europe.
  6. Rossi, P.H., Lipsey, M.W. & Henry, G.T. Evaluation: A Systematic Approach. 8th ed. SAGE, 2019. Standard reference for program evaluation methodology, including minimum timeline standards for longitudinal impact studies.
  7. W.K. Kellogg Foundation. Evaluation Handbook. 2004. Establishes the 10% evaluation budget benchmark widely adopted by foundations and government funders.
  8. Smith, T.W. "Trends in Social Trust." General Social Survey Cross-Section and Panel Data. NORC at the University of Chicago, 2019. Provides validated trust measurement instruments used in national longitudinal research.
  9. VanderWeele, T.J. "On the Promotion of Human Flourishing." Proceedings of the National Academy of Sciences, 114(31), 2017. Proposes a multi-domain flourishing framework with composite measurement methodology.
  10. Kim, M. & Mason, D.P. "Revenue Diversification and Nonprofit Financial Health." Nonprofit and Voluntary Sector Quarterly, 49(2), 2020. Provides empirical evidence for the relationship between funding diversification and organizational sustainability.
  11. Belfield, C.R., Levin, H.M. & Rosen, R. "The Economic Value of Opportunity Youth." Corporation for National & Community Service, 2012. Quantifies the fiscal and social costs of youth disconnection from education and employment.
  12. U.S. Department of Labor, Employment and Training Administration. "Job Corps Performance Results, Program Year 2023." 2024. Reports completion rates and placement outcomes for the largest federal youth workforce program.

Communities do not lack good intentions. They lack aligned systems. The gap between aspiration and execution is not filled by more meetings, more grants, or more strategic plans. It is filled by the slow, difficult work of building institutions that connect education to employment, employment to family stability, and family stability to civic renewal.

This toolkit provides the scaffolding. But scaffolding is not the building. The building is what happens when a superintendent calls a chamber president about an apprenticeship slot. When a pastor connects a single mother to the workforce navigator at the community college. When an employer redesigns a job posting because a high school CTE teacher showed them what their students can actually do. Those are the moments where systems change. They do not happen by accident. They happen because someone built the table, shaped the idea, and measured the result.

SEED exists to generate those moments—and then to step back. The institutions this process creates should outlast every organization involved in their founding, including SEED itself. That is not a limitation of the model. It is the point.

Damon Gardenhire

Founder, SEED Alliance & LINCHPIN

March 2026