Why the US Navy Needs a Lessons-Learned Center for Shipbuilding

By Dr. Marcus Jones
In March 2025 testimony before the House Armed Services Committee’s Seapower and Projection Forces Subcommittee, Ronald O’Rourke, a naval analyst at the Congressional Research Service since 1984, sharpened an excellent recommendation he has advanced for more than a decade: the U.S. Navy should establish a dedicated institutional mechanism for systematically capturing, analyzing, and transmitting lessons learned from its shipbuilding programs.
Although the U.S. Navy has accumulated an extraordinary body of experience in ship design and construction over more than two centuries, it continues to make avoidable mistakes in major acquisition programs: proceeding into construction with incomplete designs, integrating immature technologies, relying on unrealistic cost and schedule estimates, and allowing accountability structures to erode once a program becomes politically or industrially “too big to fail.” These errors are not unique to the Navy, but they are particularly consequential in shipbuilding, where program timelines are long, platforms are few and expensive, and consequences are measured in strategic as well as fiscal terms.
O’Rourke’s solution is a “lessons-learned center” for naval shipbuilding: a dedicated, continuous, and institutionalized effort to capture knowledge from past programs, distill it into accessible form, and ensure it informs future design, acquisition, and oversight decisions. The value of such an entity, he argues, would lie in its ability to prevent repeated mistakes, reduce waste, improve program outcomes, and help sustain the Navy’s long-term force design and industrial base goals. It addresses key features of the Navy’s acquisition environment: the discontinuous and generational nature of major shipbuilding programs; the structural fragmentation of knowledge across commands, contractors, and government agencies; and the absence of an educational or doctrinal home for critical institutional memory.
Unlike weapons or aircraft programs, which may see dozens or hundreds of iterations within a single career, major ship classes are often designed and constructed once every 20 or 30 years. The effect of this long cycle time is that most individuals involved in a new class of ships – whether program managers, naval architects, flag officers, or congressional staffers – may have had no direct role in the last one. What should be institutional memory therefore becomes diffuse personal recollection, vulnerable to retirement, reassignment, or obsolescence. Moreover, the knowledge necessary to understand past program outcomes is distributed across a complex web of organizations: Program Executive Offices, NAVSEA and its affiliated labs and centers, shipyards and primes and sub-tier contractors, OPNAV resource sponsors, the Office of the Secretary of Defense, and various congressional committees and watchdogs. Each retains only partial and often incompatible records, and there is little incentive or mechanism for aggregating these into a unified analytic understanding. While internal program reviews, GAO reports, and RAND studies may document lessons after the fact, there has never been an entity within the Navy tasked with curating, synthesizing, or teaching these insights.
Interestingly, O’Rourke does not propose a narrowly bureaucratic mechanism but envisions a range of possible instantiations, from a structured repository of documents to a more active, curriculum- and wargame-integrated enterprise. But what matters in his framing is not form but function: the institutionalization of a reflective capacity for learning from experience and applying that learning prospectively in ways that materially improve outcomes.
Such a capability, if properly implemented, would amount to a kind of strategic memory for the Navy, one able to withstand changes in leadership, budget, and political context, while enabling the service to treat shipbuilding not as a sequence of isolated procurements but as a continuous and evolving system of practice. It is not, therefore, a technocratic fix for acquisition inefficiencies, but a cultural transformation within the Navy’s approach to its own history of design, development, and production. It holds out the prospect that the Navy would not only save money and avoid failure, but reaffirm its preferred identity as a thinking, adaptive, and strategically serious organization. It is this deeper institutional value – far beyond process improvement – that makes O’Rourke’s proposal for a naval shipbuilding lessons-learned center important and long overdue.
Joint Lessons on Lessons Learned
The idea has modest precedent and ample justification. One of the most robust models of institutional learning in the defense sector is the U.S. Army’s Center for Army Lessons Learned (CALL), established in 1985 in response to the operational shortcomings revealed during Operation Urgent Fury in Grenada. CALL’s mission was to systematically collect, analyze, and disseminate operational and tactical lessons. Over time, it became fully integrated into Army doctrine and planning, fielding collection teams, producing analytic bulletins, and shaping professional military education. But of particular relevance to the Navy’s shipbuilding enterprise is a less widely known but equally instructive initiative: the Center for Army Acquisition Lessons Learned (CAALL), housed within the Army Materiel Systems Analysis Activity.
Established following the 2010 Army Acquisition Review, which cited the absence of a centralized mechanism for analyzing acquisition successes and failures, CAALL provides an authoritative source for acquisition-specific lessons across the Army’s program offices. It operates a web-enabled Acquisition Lessons Learned Portal (ALLP) through which project teams submit concise, structured, and searchable lessons, each tagged by acquisition phase, milestone, cost and schedule impact, and functional category.
These are not vague observations but lessons distilled from real program experience and embedded in metadata-rich formats that support both searchability and trend analysis. CAALL analysts conduct deep-dive studies of recurring issues, such as documentation burden, Earned Value Management failures, or test duplication, and prepare “just-in-time” lesson packages for project managers entering specific acquisition phases. The Center also engages in outreach, publishes bulletins, curates spotlight topic zones, and supports internal Army decision-making with synthesized data on the top five systemic challenges facing Army programs. It demonstrates that institutional learning is within reach but requires structured data, a deliberate submission pipeline, professional analytical support, and educational integration. It also shows how lessons can be transformed from static reflections into dynamic inputs for decision support, policy revision, and curriculum development. Most importantly, CAALL demonstrates that such a capability can be sustained over time, through leadership endorsement, modest staffing, and the aggressive use of digital tools.
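To make the tagging concrete, the following is a minimal sketch, in Python, of what a structured lesson record of the kind CAALL’s portal collects might look like. The field names, phase labels, and example values are illustrative assumptions, not the actual ALLP schema.

# A hypothetical lesson record, loosely patterned on the tagging CAALL's
# Acquisition Lessons Learned Portal is described as using. All field
# names and phase labels are illustrative assumptions, not ALLP's schema.
from dataclasses import dataclass, field
from enum import Enum


class AcquisitionPhase(Enum):
    MATERIEL_SOLUTION_ANALYSIS = "materiel solution analysis"
    TECHNOLOGY_MATURATION = "technology maturation and risk reduction"
    ENGINEERING_AND_MANUFACTURING = "engineering and manufacturing development"
    PRODUCTION_AND_DEPLOYMENT = "production and deployment"
    OPERATIONS_AND_SUPPORT = "operations and support"


@dataclass
class LessonRecord:
    title: str
    narrative: str                    # what happened and why, in causal terms
    phase: AcquisitionPhase           # life-cycle phase the lesson applies to
    milestone: str                    # e.g., "Milestone B" (illustrative)
    cost_impact: str                  # rough magnitude, e.g., "moderate"
    schedule_impact: str              # rough magnitude, e.g., "six-month slip"
    functional_tags: list[str] = field(default_factory=list)
    sources: list[str] = field(default_factory=list)   # traceable references

A record structured this way is what makes the trend analysis described above possible: phase and impact fields can be aggregated across programs, rather than buried in narrative prose.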
A shipbuilding-focused counterpart – scaled appropriately to the Navy’s size, resourced modestly, and empowered to draw insight from both current and historical programs – would not need to reinvent the wheel. It would only need to learn how others have made their institutions learn.
Other models further underscore the feasibility and necessity of such a capability. The Joint Lessons Learned Program (JLLP) applies a five-phase process – discovery, validation, resolution, evaluation, and dissemination – to lessons arising from joint exercises, operations, and experiments. Its information system, JLLIS, acts as a system of record for tracking, archiving, and analyzing lessons that affect force development and joint capability planning.
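The five phases lend themselves to a simple linear workflow. The sketch below encodes them in Python; the phase names come from the program’s description, but the transition logic is an illustrative assumption, not a description of how JLLIS actually tracks lessons.

# The JLLP's five phases as a simple linear workflow. Phase names come
# from the program's description; the advance() logic is an illustrative
# assumption, not JLLIS internals.
from enum import Enum


class JLLPPhase(Enum):
    DISCOVERY = 1
    VALIDATION = 2
    RESOLUTION = 3
    EVALUATION = 4
    DISSEMINATION = 5


def advance(phase: JLLPPhase) -> JLLPPhase:
    """Move a lesson to the next phase; dissemination is terminal."""
    if phase is JLLPPhase.DISSEMINATION:
        return phase
    return JLLPPhase(phase.value + 1)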
A more technical and directly relevant precedent is found in NASA’s Lessons Learned Information System (LLIS). NASA’s LLIS arose from the hard-won awareness, following the Challenger and Columbia disasters, that high-stakes engineering efforts demand not only risk management tools but a durable culture of reflection and improvement. NASA’s system integrates lessons into program planning and design reviews and allows for long-term traceability of decisions and failures. The agency’s approach, emphasizing root cause analysis, organizational memory, and education, aligns with the intended mission of a Naval Shipbuilding Lessons-Learned Center (NSLLC): to translate the history of naval shipbuilding experience into anticipatory guidance for future programs. Like NASA, the Navy deals with one-off, bespoke, high-cost platforms with life cycles spanning decades. The discipline required to learn systematically from such endeavors is the same.
Even in the commercial sector, complex system integrators such as Boeing, Airbus, and multinational energy firms have turned to lessons-learned systems, both formal and ad hoc, to analyze catastrophic failures and to course-correct future programs. The Construction Industry Institute’s lessons-learned repositories, used by engineering and construction firms to improve execution of large-scale infrastructure projects, are still another model for post-project analysis and feedback. These efforts are often grounded in shared technical taxonomies, design decision trees, and “causal maps” that allow construction organizations to relate performance outcomes to earlier architectural or managerial choices. The Navy’s shipbuilding community, which is distinguished by even greater system and technological complexities and similar exposure to path-dependent design choices, lacks such a coherent and systematized mechanism. An NSLLC would hold out the promise of that capability.
Of course, these precedents cannot simply be imitated wholesale, but they offer essential lessons in form, function, and value. Each succeeds not by relying on passive documentation and informal processes, but by embedding structured learning into the decision cycles and professional cultures of their organizations. What an NSLLC must do is adapt this logic to the particularities of U.S. naval shipbuilding: its long timelines, institutional fragmentation, industrial dependencies, and strategic visibility. It must provide an analytic and educational platform that helps naval leaders and engineers reason more effectively about cost, capability, risk, and design. It must produce continuity across ship classes and across generations of acquisition professionals. And it must do so not as a retrospective archive alone, but as a living resource embedded in professional education, program governance, and future planning.
Over the past several decades, the U.S. Navy has been the subject of repeated and increasingly urgent calls to establish a formal mechanism for doing just that, all of which have, time and again, failed to take root. While the service has often acknowledged the recurrence of major programmatic mistakes – most notably in high-profile acquisition efforts such as the Littoral Combat Ship, the Zumwalt-class destroyer, and the Ford-class aircraft carrier – it has not developed a durable, institutionalized capacity for engineering and acquisition-oriented organizational learning. This failure has not gone unremarked. A lineage of initiatives, proposals, and critiques – some internal, some external, some aspirational, others postmortem – has identified the absence of such a capacity as a root contributor to the Navy’s persistent shipbuilding troubles.
Perhaps the most compelling of these efforts is a 2022 MIT thesis by naval engineer Elliot Collins, which deserves attention not only for its technical sophistication but for its diagnosis of a deep institutional shortcoming. Collins, a Navy officer serving in the DDG(X) design program, observed firsthand what he describes as a structural absence of organizational memory in Navy ship design and acquisition. His thesis, written under the auspices of MIT’s Naval Construction and Engineering program, proposes the creation of a Navy Design Notebook System (NDNS): a digital, structured, and lifecycle-aware framework for recording and organizing design decisions, assumptions, lessons, and engineering rationale across a ship’s development. Drawing inspiration from both Toyota’s engineering notebook practice and the best traditions of systems engineering, Collins lays out a clear taxonomy and architecture for capturing knowledge in real time and rendering it useful across multiple programs and decades. Crucially, the NDNS is not just a data storage concept, but a model for how design reasoning can be institutionalized so that the lessons of one generation are accessible and intelligible to the next.
The significance of Collins’s proposal lies in the lineage of failed or underdeveloped efforts that it implicitly seeks to redeem. As far back as the 1970s, the Navy undertook an informal initiative known as the REEF POINTS series, pamphlet-style reflections on acquisition experience intended to help incoming program officers. But the REEF POINTS effort lacked formal backing, consistent authorship, or archival permanence, and it quickly faded as personnel rotated out and no office assumed responsibility for sustaining it. Later assessments, including a 1993 Department of Defense Inspector General report, found that the Navy lacked a centralized system for capturing acquisition lessons learned, and more critically, that it made little practical use of the systems it did possess. Data were gathered, but not applied; observations made, but not preserved; patterns noted, but not internalized. The diagnosis repeated itself in a 2002 analytical review commissioned by the Army War College, which found that across the Department of Defense, lessons-learned programs often failed not for lack of insight but for lack of organizational stewardship, cultural support, and procedural integration.
Why, then, despite these longstanding recognitions, has the Navy failed to institutionalize a lasting lessons-learned capability in its shipbuilding enterprise? The reasons are multiple and reflect a misalignment between the operational culture of the Navy and the administrative and engineering demands of ship design. Unlike the tactical communities of naval aviation or undersea warfare – where debriefing, checklist revisions, and iterative training are ingrained – the acquisition enterprise lacks a comparable feedback loop. Moreover, the Navy’s engineering education pathways, from undergraduate technical training to postgraduate systems curricula, have not systematically incorporated acquisition case studies or design failures into their pedagogy. There is no consistent mechanism to bring shipbuilding experience into the classroom, the wargame, or the design studio. Lessons remain tacit, siloed, and anecdotal.
That the Navy has lacked such a capacity for so long is a failure of imagination and institutional design, but it is not an irremediable one. The architecture of such a capability already exists in other domains, from NASA to the Army to the commercial nuclear sector. The Navy does not need to invent a solution from whole cloth; it needs to adapt proven models to its own technical and cultural context. What is required is not another ad hoc study or retrospective review, but the establishment of a permanent Naval Shipbuilding Lessons-Learned Center, a durable institutional home where technical memory, engineering reasoning, and acquisition insight can be collected, structured, and applied. The central question, then, is not whether such a center is needed, but what it should consist of, how it should function, and where it should reside.
The Devil in the Details
To be more than a bureaucratic corrective or another forgotten archive, a shipbuilding lessons-learned program must do more than catalog what has gone wrong in previous programs or indulge in generalities about process improvement; it must fulfill a set of core functions rigorous enough to match the failures it seeks to prevent. The first and most essential function is to identify and preserve actual lessons: not loose observations or platitudes, but knowledge with clear causal content, derived from real program experience, and supported by traceable evidence.
To qualify as such, a lesson must demonstrate causal specificity: what precisely caused the outcome it describes, and why. It must be replicable or at least transferable across contexts, suggesting how it might inform other ship types or acquisition models. It must be traceable to primary sources – engineering drawings, test data, milestone reviews – so that its logic can be reconstructed and its authority verified. It must be actionable, capable of informing future decisions, whether at the level of design margin, contract structure, or policy architecture. And ideally, it should possess counterfactual depth: the ability to show not only what happened, but what might have happened differently under other choices.
When filtered through this lens, the lessons that matter and that a center must preserve fall broadly into five categories. First are design integration lessons, insights into how complex systems interact within the hull, and how early design assumptions or immature technologies can generate cascading failures, as in the DDG-1000’s power system or the Ford-class’s EMALS launch mechanism. Second are construction and manufacturing lessons, which speak to the translation of design into physical product: the timing of block assembly, the thresholds at which digital coordination outperforms paper-based workflows, the effects of workforce experience on productivity. Third are program management and acquisition lessons (perhaps the most politically fraught) concerning contract type selection, milestone pacing, and the dangers of concurrency. Fourth are industrial base and supply chain lessons, which trace how changes in the broader defense industrial ecosystem—supplier attrition, workforce bottlenecks, fragility in the materials base—constrain program execution in ways the Navy and its private shipbuilders often fail to anticipate. And finally, there are historical, strategic, and doctrinal lessons, which reveal how misalignments between strategic ambition and industrial reality (fleet design concepts that outpace build capacity, for instance) can derail even well-managed programs.
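One way to see how the criteria and categories above might operate in practice is as an intake check on candidate submissions. The Python sketch below is illustrative only; the field names and the pass/fail logic are assumptions, not a proposed Navy standard.

# Encoding the five lesson categories and the qualifying criteria from
# the preceding paragraphs as a simple intake check. All names and the
# pass/fail logic are illustrative assumptions, not a Navy standard.
from enum import Enum


class LessonCategory(Enum):
    DESIGN_INTEGRATION = "design integration"
    CONSTRUCTION_AND_MANUFACTURING = "construction and manufacturing"
    PROGRAM_MANAGEMENT = "program management and acquisition"
    INDUSTRIAL_BASE = "industrial base and supply chain"
    HISTORICAL_STRATEGIC = "historical, strategic, and doctrinal"


def qualifies_as_lesson(record: dict) -> bool:
    """Apply the text's criteria: causal specificity, transferability,
    traceability, and actionability (counterfactual depth is a bonus)."""
    required = (
        "causal_claim",        # what caused the outcome, and why
        "transfer_contexts",   # other ship types or models it could inform
        "sources",             # primary sources so the logic can be verified
        "recommended_action",  # a future decision it can shape
    )
    return all(record.get(key) for key in required)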
Still, it is not enough just to identify them; lessons must be preserved and organized within a structure that allows them to be used. Here, the Navy can draw on models such as that proposed by Collins in his thesis: a digital, lifecycle-aware knowledge framework that tags and stores design decisions, assumptions, and lessons in a manner that makes them accessible not only to current program staff but to future generations. Such a system would form the backbone of the NSLLC’s information architecture: structured, searchable, phase-referenced, and durable. It would allow engineers working on SSN(X) to understand not just that the Virginia-class succeeded or stumbled in certain areas, but why, under what constraints, and according to which tradeoffs. It would enable program sponsors to distinguish between lessons that were context-specific and those that reflect deeper structural patterns.
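A minimal sketch of the phase-referenced retrieval described here, assuming a simple tagged archive in the spirit of, though not taken from, Collins’s NDNS design:

# Phase-referenced retrieval from a tagged lessons archive: a future
# program team pulls prior-class lessons by ship class and design phase.
# Records and field names are hypothetical, illustrating the NDNS
# concept rather than its actual architecture.
def find_lessons(archive: list[dict], ship_class: str, phase: str) -> list[dict]:
    """Return lessons tagged to the given ship class and life-cycle phase."""
    return [lesson for lesson in archive
            if lesson["ship_class"] == ship_class and lesson["phase"] == phase]


# Hypothetical usage: an SSN(X) designer querying Virginia-class history.
archive = [
    {"ship_class": "Virginia", "phase": "detail design",
     "title": "Outfitting density vs. rework rates"},   # invented example entry
]
print(find_lessons(archive, "Virginia", "detail design"))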
The most critical function of the NSLLC, however, is not archival but pedagogical. Lessons, to be meaningful, must be taught as part of a living curriculum, not simply filed as dry memoranda or summary slides. The center must work directly with educational institutions to embed lessons into the professional formation of officers, policy officials, engineers, and acquisition professionals. This means developing decision-forcing cases that place students in the shoes of historical program leaders, confronting them with the actual dilemmas and constraints those leaders faced. It means designing wargames and exercises that test tradeoffs in acquisition, industrial surge, and fleet composition. It means seeding capstone projects, research initiatives, and faculty development efforts with questions drawn from real program history. And it means, above all, creating a culture in which experience is not simply remembered but used as a guide to reasoning, as a check against institutional hubris or forgetfulness, and as a source of comparative advantage in a strategic environment where time and resources are finite.
Finally, the Center must function diagnostically on behalf of Navy decision-makers, as a resource for the review of future program plans, bringing to bear its corpus of structured knowledge to identify early warning signs of known failure modes or to highlight opportunities for constructive borrowing across ship classes. This is not a matter of punitive oversight, but of anticipatory guidance: bringing past reasoning to bear on present decisions in a way that deepens accountability and reduces risk.
What this amounts to is a knowledge institution, not in the narrow academic sense but in the most operationally vital sense of the term. The NSLLC would exist to ensure that the U.S. Navy no longer builds its ships without memory. It would translate past pain into future prudence, and costly failure into usable foresight. And it would mark, at last, the point at which naval shipbuilding began to behave not just as a procurement function, but as a learning system worthy of the stakes it bears.
The Way Ahead
What would such a center look like in practice? If the value of a Naval Shipbuilding Lessons-Learned Center lies in the integrity and usability of its knowledge, then its organizational structure must be equally deliberate. It should not replicate the diffuse and stovepiped landscape of existing program oversight offices, but rather bridge the engineering, acquisition, policy, and education communities. And in keeping with the realities of today’s defense fiscal environment, it must be lean, digitally enabled, and architected from the start to minimize overhead. The NSLLC should be organized as a small, hybrid analytical and educational unit with the smallest group of affiliated personnel circumstances permit: naval engineers with experience in major design and production programs; acquisition professionals familiar with contracting and program management dynamics; historians of technology and naval policy who can trace institutional lineages and doctrinal consequences; and digital knowledge architects to manage its structured repository and analytic tools. Core activities would be augmented by short-term fellows – rotating billets for officers, civilians, or academics on sabbatical or detail – who would conduct targeted case studies, contribute to curriculum development, or lead diagnostic reviews of current programs. Rather than attempt to recreate or replace existing program data flows, the Center should connect to them, drawing on NAVSEA, PEO Ships and Submarines, CRS, GAO, and DoD IG reports, and synthesize across them to produce pedagogically and analytically coherent insights.
To reduce cost and footprint, the Center must leverage digital tools aggressively. A cloud-based digital architecture, modeled in part on the NDNS framework, would form the heart of the operation: a searchable, metadata-tagged, phase-referenced archive of lessons that supports analysis, instruction, and red-teaming of future programs. Visualization tools like interactive timelines, decision trees, and traceability matrices should be prioritized over staff-intensive publishing or editorial operations. Whenever possible, the Center’s materials should be reusable across formats: a single case study might underpin a midshipman seminar, an acquisition wargame, and a policy memo to ASN(RDA). In this sense, the Center is less a physical institute than a virtual and modular capability: one that enables reflection, instruction, and anticipatory decision support wherever shipbuilding is debated or taught.
As to its location, the author will admit to a conflict of interest, being a longtime member of the U.S. Naval Academy faculty. It may, therefore, sound parochial to suggest that the NSLLC be housed at Annapolis. That said, there are good reasons, symbolic and practical, why the Naval Academy may be a fitting institutional home. The Academy is the Navy’s enduring schoolhouse, the place where generations of officers are introduced not just to the fleet, but to the long arc of naval experience. It offers a rare confluence of technical education, historical reflection, and leadership formation.
Moreover, it sits proximate to the Washington-area institutions with which the NSLLC would regularly interact – NAVSEA, the Navy labs and warfare centers, OPNAV and the Secretariat organization, and the various acquisition and oversight bodies headquartered in the capital region. Perhaps most importantly, the Academy is a place not just of training, but of memory. To locate the Center there would signal that lessons are not just compliance artifacts or after-action musings, but a core component of professional identity. It would allow the Center’s work to be integrated directly into engineering coursework, capstone design, fleet seminars, and acquisition electives. And it would give midshipmen, from the beginning of their careers, access to a body of knowledge that has existed until now only in fragments.
But what matters is not the administrative chain but the Center’s function: to make memory usable, to make learning permanent, and to help the Navy move from a culture of crisis improvisation to one of cumulative, adaptive competence. Wherever it is housed, a Naval Shipbuilding Lessons-Learned Center should embody the values it seeks to cultivate: frugality, clarity, and strategic discipline. And in doing so, it may just help the Navy build not only better ships, but a better institution.
Dr. Marcus Jones is an associate professor in the history department at the United States Naval Academy.
This article appears courtesy of CIMSEC, and may be found in its original form here.
The opinions expressed herein are the author's and not necessarily those of The Maritime Executive.