#legacy-systems — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #legacy-systems, aggregated by home.social.
-
3 Minutes of Downtime. Full Microsegmentation. No Changes Required.
https://youtu.be/OOQYMXJi-0M #cybersecurity #HardwareMicrosegmentation #Manufacturing #LegacySystems #SASE #FIPS140-2 #ICS #OT #IoT #RiskManagement
-
Closed source wasn’t always a strategy.
It was the default. Teams operated with limited visibility.
Control felt efficient — but slowed collective learning.
From The Source Code Spectrum 🧠
🔗 https://www.softwareantifragility.com/p/the-source-code-spectrum
-
https://winbuzzer.com/2026/01/22/security-risk-many-atms-are-still-running-windows-7-xcxwbn/
Security Risk: Many ATMs are Still Running Windows 7
#Windows7 #Cybersecurity #ATMs #Microsoft #Security #Fraud #Cybercrime #LegacySystems #Banking #BigTech
-
What Does a Good Spec File Look Like?
Most legacy government systems exist in a state of profound documentation poverty. The knowledge lives in the heads of retiring employees, in COBOL comments from 1987, in binders that may or may not reflect current behavior. Against this baseline, the question of what makes a “good” spec file takes on different dimensions than it might in greenfield development.
Common Elements
Any spec worth writing answers the fundamental question: what are we building and why? Beyond that, good specs share a few specific characteristics:
Clear success criteria. Not just features, but how you’ll know the thing works. This matters especially when AI agents are generating implementations—they need something concrete to validate against.
Constraints and boundaries. What’s out of scope. What technologies or patterns to use or avoid. Performance requirements. AI tools are prone to scope creep and assumption-making without explicit boundaries.
Examples of expected behavior. Concrete inputs and outputs, edge cases, error states. These serve as both specification and implicit test cases.
Context about the broader system. How this piece fits into what exists. AI assistants lack awareness of surrounding code and architectural decisions unless you tell them.
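To make the “examples of expected behavior” element concrete, here is a minimal sketch in Python of how a spec’s input/output examples can double as test cases. The eligibility rule, dollar limits, and function name are hypothetical, invented only for illustration; they are not drawn from any actual system or from the SpecOps materials.

```python
# Hypothetical sketch: expected-behavior examples from a spec, written so a
# domain expert can read them and a CI job (or AI agent) can execute them.
# The eligibility rule and dollar amounts are invented for illustration.

def is_income_eligible(monthly_income: float, household_size: int) -> bool:
    """Illustrative eligibility check, not any real program's logic."""
    limits = {1: 1255.00, 2: 1704.00, 3: 2152.00}  # placeholder limits
    return monthly_income <= limits.get(household_size, 2152.00)

def test_single_applicant_under_limit_is_eligible():
    assert is_income_eligible(1200.00, household_size=1) is True

def test_single_applicant_over_limit_is_not_eligible():
    assert is_income_eligible(1300.00, household_size=1) is False

def test_income_exactly_at_limit_is_eligible():
    # Edge case the spec should call out explicitly: the limit itself qualifies.
    assert is_income_eligible(1255.00, household_size=1) is True
```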
The SpecOps Context
When modernizing legacy government systems, specs serve a different purpose than typical development documentation. They’re not just implementation guides—they are artifacts that preserve institutional knowledge. This changes what “good” looks like.
A SpecOps specification document must work for multiple audiences simultaneously: domain experts who verify that the spec captures policy intent, software developers and AI coding agents who need precision to generate correct implementations, and future humans who need to understand why the system behaves a certain way years from now—possibly after everyone currently involved has moved on.
That last audience is the one most spec formats neglect entirely.
Three States, Not One
Legacy system specs can’t just describe “what the system does.” They need to distinguish between:
- Current system behavior—what the legacy code actually does today, bugs and all
- Current policy requirements—what the system should do according to governing statutes and regulations
- Technical constraints—what the system cannot do regardless of policy, due to missing integrations or platform limitations
These three things can be in alignment or tension at any moment. And that alignment can shift over time without the code changing—a policy update tomorrow can transform compliant behavior into a violation.
Known Deviation Patterns
Consider the example of a benefits system that should verify income against state tax agency records, but where the legacy system only captures self-reported income because the integration with the tax agency was never built. A good spec would make this explicit:
Policy requirement: Per [directive], applicant income must be verified against tax agency records prior to benefit approval.
Current implementation: Self-reported income only. Applicant provides income information on Form X.
Deviation reason: No interface to tax agency income verification service exists. Integration requested in 2019, not funded.
Modernization note: Modern implementation should include tax agency income verification integration.
This surfaces the gap, documents why it exists, and gives the modernization effort clear direction—without pretending the legacy system does something it doesn’t.
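As a rough sketch of how such an entry might be captured as structured data rather than prose, the Python dataclass below mirrors the fields in the example above. The class and field names are my own illustration, not a format the SpecOps methodology prescribes, and the bracketed directive reference is left unresolved just as it is in the prose.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative structure for a known-deviation entry in a spec. Field names
# follow the prose example above; nothing here is a prescribed format.
@dataclass
class KnownDeviation:
    policy_requirement: str       # what the governing statute or directive requires
    policy_citation: str          # statute, regulation, or directive reference
    current_implementation: str   # what the legacy code actually does today
    deviation_reason: str         # why the gap exists (missing integration, funding, etc.)
    modernization_note: Optional[str] = None  # direction for the modern implementation

income_verification_gap = KnownDeviation(
    policy_requirement=(
        "Applicant income must be verified against tax agency records "
        "prior to benefit approval."
    ),
    policy_citation="[directive]",  # left unresolved, as in the example above
    current_implementation="Self-reported income only; applicant provides income on Form X.",
    deviation_reason=(
        "No interface to the tax agency income verification service exists. "
        "Integration requested in 2019, not funded."
    ),
    modernization_note=(
        "Modern implementation should include tax agency income verification integration."
    ),
)
```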
Explicit Ambiguity as a Feature
There’s something that seems almost radical about a methodology that says “write down what you don’t know.” Traditional documentation can project false confidence. It often describes how things should work and quietly omits the messy parts.
A spec that explicitly marks areas of tension or uncertainty is more honest, more useful for risk assessment, and a better starting point for modernization. It’s an invitation for future clarification rather than a false endpoint.
A spec with unresolved tension is better than no reviewable documentation at all.
Policy Grounding
Government system specs need explicit links to authorizing statutes, regulations, or directives. Not just “these items are excluded from income calculations” but “per 42 USC § 1382a, the following items are excluded from income calculations.”
This is the why that survives personnel turnover. It’s what allows future teams to evaluate whether behavior that was correct five years ago still aligns with current policy.
Decision Records
When domain experts verify a spec, they make judgment calls—especially where legacy behavior diverges from current policy understanding. Those decisions need to be captured in the spec, not in a separate document that gets lost.
The spec becomes the repository of institutional reasoning, not just institutional behavior.
Accessible or Precise?
The SpecOps approach says that specs should be “readable by domain experts while detailed enough to guide implementation.” This is genuinely hard.
Options include stratified specs (plain-language summaries with expandable technical detail), executable specs (written as tests that are simultaneously human-readable and machine-verifiable), or annotated specs (a single verbose document where technical precision is explained inline).
Given that the spec is meant to be the source of truth that outlasts implementations, keeping everything in one artifact—even at the cost of verbosity—reduces the risk of layers drifting apart over time.
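As one illustration of the “executable specs” option, the Python sketch below shows a test whose names and docstrings read as plain language for a policy reviewer while the assertions stay machine-verifiable. The statute reference comes from the Policy Grounding section above; the countable_income function and the dollar amounts are placeholders, not a reading of the actual statute.

```python
# Sketch of an "executable spec": human-readable test names and docstrings,
# machine-verifiable assertions. The function and dollar amounts below are
# placeholders, not an interpretation of the cited statute.

def countable_income(gross_income: float, excluded_amount: float) -> float:
    """Hypothetical calculation of countable income after statutory exclusions."""
    return max(gross_income - excluded_amount, 0.0)

def test_excluded_items_are_not_counted_toward_income():
    """Per 42 USC § 1382a, certain items are excluded from income calculations.

    Spec intent: excluded amounts reduce countable income dollar for dollar.
    """
    assert countable_income(gross_income=500.00, excluded_amount=20.00) == 480.00

def test_exclusions_never_produce_negative_countable_income():
    """Spec intent: countable income is floored at zero."""
    assert countable_income(gross_income=10.00, excluded_amount=20.00) == 0.00
```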
The Road Ahead
We’re still in the early days. Questions remain open:
- How granular should policy references be?
- What’s the right way to represent known deviations?
- How should specs age—versioning, or is git history enough?
- What level of detail helps AI agents versus adding noise?
These will get answered empirically as more agencies adopt the approach. The methodology will evolve. The important thing is to start—to surface questions that were previously invisible, to give future teams something to interrogate rather than nothing at all.
Because the knowledge is what matters. Everything else is implementation details.
#ai #artificialIntelligence #chatgpt #governmentServices #legacySystems #systemModernization
-
Modern networks weren’t built for the devices we still depend on.
https://youtu.be/Aq0Ja03Q7Wo #cybersecurity #hardware #hardened #FIPS140-2 #ZeroTrust #Defense #Industrial #Manufacturing #LegacySystems
-
Proving Out a New Approach to Legacy System Modernization
Government legacy systems hold decades of institutional knowledge – eligibility rules, policy interpretations, edge cases learned the hard way. When agencies modernize these systems, the typical approach is to translate old software code into new software code. But this approach misses something fundamental – the knowledge embedded in these legacy systems is more valuable than the code itself.
SpecOps is a methodology I’ve been developing that flips the typical approach to legacy system modernization. Instead of using AI tools to convert, say, COBOL code into Java code, SpecOps uses AI to extract institutional knowledge from legacy code into plain-language specifications that domain experts can actually verify. The specification becomes the source of truth and guides spec-driven development of modern systems – update the spec first, then use the spec to update the code.
One way to think about it is like GitOps for system behavior – version-controlled specifications govern all implementations, creating an audit trail and enabling proper oversight of changes.
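As a rough illustration of that GitOps-like governance (not part of SpecOps itself), the Python sketch below shows a CI-style check that fails when implementation code changes without a corresponding spec change. The src/ and specs/ paths and the base branch are assumptions about a hypothetical repository layout.

```python
# Hypothetical CI guard: require a spec update whenever implementation code
# changes. Repository layout (src/, specs/) and base branch are assumptions.
import subprocess
import sys

def changed_files(base: str = "origin/main") -> list[str]:
    """List files changed on this branch relative to the base branch."""
    result = subprocess.run(
        ["git", "diff", "--name-only", f"{base}...HEAD"],
        capture_output=True, text=True, check=True,
    )
    return [line for line in result.stdout.splitlines() if line]

def main() -> int:
    files = changed_files()
    touched_impl = any(f.startswith("src/") for f in files)
    touched_spec = any(f.startswith("specs/") for f in files)
    if touched_impl and not touched_spec:
        print("Implementation changed without a spec update. Update the spec first.")
        return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```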
Testing the approach with IRS Direct File
To flesh this approach out more fully, I built a demonstration using the IRS Direct File project – the free tax filing system that launched in 2024 and is available on GitHub. It’s not “legacy” per se, but it is an ideal test case for several reasons – it has complex business logic interpreting the Internal Revenue Code and a multi-language codebase (TypeScript, Scala, Java), and it implements a set of rules that tax policy experts can verify.
To support this demo, I created a reusable set of AI instructions (i.e., “skills” files) for analyzing tax system code:
- Tax Logic Comprehension — the foundation skill for understanding IRC references and tax calculations
- Standard Deduction Calculation — extracting standard vs. itemized deduction logic
- Dependent Qualification Rules — capturing the tests for qualifying children and relatives
- Scala Fact Graph Analysis — understanding the declarative knowledge graph structures
To run the demo, I pointed three different AI models (GPT-5, Gemini 2.5 Pro, and Claude Sonnet 4.5) at actual code samples from the Direct File GitHub repo and asked them to generate specifications.
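The sketch below gives a simplified picture of the shape of that run: combine a skill file with a code sample into a single prompt and send it to each model. The file paths and the call_model helper are stand-ins I’ve invented here; the actual prompts, layout, and model calls are in the demo repository linked under “Get involved” below.

```python
# Simplified, hypothetical harness for the demo run. File paths and the
# call_model() helper are placeholders; see the demo repository for the
# actual prompts and structure.
from pathlib import Path

MODELS = ["gpt-5", "gemini-2.5-pro", "claude-sonnet-4.5"]

def build_prompt(skill_path: str, code_path: str) -> str:
    """Combine a skill file and a code sample into one extraction prompt."""
    skill = Path(skill_path).read_text()
    code = Path(code_path).read_text()
    return (
        f"{skill}\n\n"
        "Using the instructions above, extract a plain-language specification "
        "from the following source code:\n\n"
        f"{code}\n"
    )

def call_model(model: str, prompt: str) -> str:
    """Placeholder for a vendor-specific API call."""
    raise NotImplementedError

if __name__ == "__main__":
    prompt = build_prompt("skills/standard-deduction.md", "samples/StandardDeduction.scala")
    for model in MODELS:
        spec = call_model(model, prompt)
        print(f"=== {model} ===\n{spec}\n")
```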
Results
Here are the results of my first attempt at running this demo.
Analysis of the generated software spec files showing the model used and the evaluation grade for each.
All three models successfully extracted business logic into plain language suitable for domain expert review. A tax policy analyst could look at the generated specs and say “yes, that’s correct” or “no, you’re missing the residency requirement” – something they likely could not do (certainly not as easily) staring at raw software code.
Notably, these results came from single prompts without iteration. The skills I put together worked across different AI vendors, demonstrating the portability of the SpecOps approach.
Why this is important for government agencies
An important point that I want to emphasize about the SpecOps approach is that it can be used whether there are immediate plans for a legacy system modernization or whether one is still several years out. SpecOps is designed to help aggregate and document knowledge about important government systems – there’s never a bad time to do that work.
Agencies can begin extracting specifications from legacy systems today, while institutional knowledge still exists and subject matter experts are still available. When modernization eventually happens – whether in two years or ten – agencies will have:
- Verified documentation for how systems actually behave
- Durable, version-controlled specifications that outlast any particular technology stack
- A foundation that makes future modernization faster, less risky, and less expensive
The alternative is waiting until an agency is forced to modernize, scrambling to reverse-engineer systems after the people who understood them have potentially retired.
Areas for further exploration
This demo also opens several questions worth investigating further:
- Verification at scale: Can policy experts efficiently review AI-generated specs? Initial feedback suggests yes, but more testing is definitely needed.
- Failure modes: The relatively low grade for Dependent Qualification indicates some room for improvement – what can be changed to generate a more highly rated system spec? A different model? A refined skill file? A better prompt (or prompts)?
- Skill refinement: The demo worked reasonably well on the first attempt. How different might the resulting spec files be with iterative prompting?
The demo repository for this effort is public and designed for replication. I’d welcome others testing this approach with different AI models, different code samples, or different domains entirely. I hope others become as excited about the potential for this approach as I am.
Get involved
- SpecOps Methodology: https://spec-ops.ai
- Demo Repository: https://github.com/mheadd/spec-ops-demo
- SpecOps Discussion: https://github.com/mheadd/spec-ops/discussions
If you work in government technology, tax policy, or legacy modernization, I’d especially value your perspective on whether the generated specifications seem genuinely reviewable by domain experts. That’s the core claim that makes SpecOps viable.
The code for all government systems will eventually be replaced. The important question for those of us who work on and with those systems is whether the knowledge of how they are supposed to work survives that transition.
#ai #artificialIntelligence #chatgpt #government #legacySystems #llm #technology
-
Who knew AOL still had 30M monthly active users? Bending Spoons did! They're acquiring the platform, banking on its data to fuel AI personalization and efficiency. Turns out, 'legacy' can be profitable fuel for innovation, but integrating it? That's the real challenge.
What old tech do you think still has untapped potential?
#AINews #TechAcquisition #BigData #LegacySystems #AI
https://www.artificialintelligence-news.com/news/bending-spoons-acquisition-of-aol-shows-the-value-of_legacy_platforms/
-
"COBOL supports close to 90% of Fortune 500 business systems today."
https://cobolcowboys.com/cobol-today/
#HackerNews #COBOL #Fortune500 #BusinessSystems #TechHistory #LegacySystems
-
🚀 From 68k ROM to bootable OS in 72 hours
📄 Working paper: https://zenodo.org/records/17196870
💻 GitHub: github.com/Kelsidavis/System7
What legacy system would you want to see preserved or modernized with this approach?
#ReverseEngineering #AI #LegacySystems #SoftwareArchaeology #TechInnovation #ComputerHistory #OpenScience
-
Thoughtworks successfully harnessed #GenerativeAI to decode legacy systems without source code!
Using Gemini 2.5 Pro, they accelerated reverse engineering, creating validated “blueprints” of functionality in just two weeks.
💡 InfoQ spoke with authors Thiyagu Palanisamy & Chandirasekar Thiagarajan to learn more about the setup of the pilot, and their reflections on the technique's potential.
📖 Read more: https://bit.ly/3K8ECII
-
Why Hardware Wins Against Software in the Real World of Microsegmentation. An Interview with BYOS CEO Matias Katz
https://youtu.be/VjYqisWGMf4 #CyberSecurity #Microsegmentation #LegacySystems #ZeroTrust #NetworkSecurity #FIPS140-2 #IoT #OTSecurity #CriticalInfrastructure
-
Growing complexity means legacy security systems miss one in every 14 threats #CyberSecurity #LegacySystems
-
https://www.powermag.com/securing-the-power-grid-cybersecurity-strategies-for-smart-energy-systems/
"dispersed nature of #smartgrids, with thousands of interconnected devices like #smartmeters, #sensors, and distributed energy resources #DER's, creates a vast attack surface. For #substations, many operate #legacysystems (such as supervisory control and data acquisition #SCADA, which are essential for monitoring and control, and are prime targets."
-
The Quiet Crisis in Legacy System Modernization
Government agencies have started experimenting with AI—particularly large language models (LLMs)—to accelerate the long-standing problem of modernizing legacy systems. A recent MITRE analysis, Legacy IT Modernization with AI, shows early promise. LLMs can be used to extract logic from old codebases and generate “intermediate representations” that help teams refactor or rewrite aging systems. It’s not a perfect solution, and it still requires human oversight, but it’s a serious step forward.
So far, the conversation on AI-assisted legacy modernization has centered on large, mission-critical federal systems—mainframe applications that support tax processing, logistics, or entitlement programs. But this focus overlooks a vast and growing problem: the thousands of small, back-office systems that keep state and local governments running. These applications don’t often make headlines, but they quietly power licensing, payroll, casework, and many other daily operations.
Many of these systems are written in obscure, decades-old languages (think MS Access). Documentation is sparse or nonexistent. The people who built and maintained them are retiring. And the government’s ability to recruit and retain technical staff has not kept pace with demand. What’s more, the sheer number of these systems—and the institutional knowledge they depend on—makes traditional modernization approaches slow and expensive.
The MITRE report provides a useful proof point: AI can help accelerate modernization. But that benefit needs to reach beyond a few flagship systems. If modernization efforts stay focused only at the federal level or only on the biggest programs, governments at every level will be stuck maintaining outdated software with dwindling staff and rising risk.
To meet this challenge, governments need a broader approach. That means funding, staffing, and supporting modernization efforts that include every level of government—not just those at the federal level. It means experimenting with AI-assisted refactoring tools on a wider range of systems. And it means ensuring that institutional knowledge doesn’t retire out of reach before the code is made maintainable again.
AI won’t solve legacy modernization on its own. But it’s the first tool in a long time that changes the speed and scale of what’s possible. We should use it—everywhere we can.
#AI #artificialIntelligence #ChatGPT #governmentServices #legacySystems #llm #systemModernization #technology
-
FAA to retire floppy disks and Windows 95 amid air traffic control overhaul - On Wednesday, acting FAA Administrator Chris Rocheleau told ... - https://arstechnica.com/information-technology/2025/06/faa-to-retire-floppy-disks-and-windows-95-amid-air-traffic-control-overhaul/ #governmenttechnology #airtrafficcontrol #vintagecomputing #chrisrocheleau #infrastructure #transportation #legacysystems #modernization #usgovernment #floppydisks #retrotech #seanduffy #windows95 #aviation
-
Good read on the computer systems at the Social Security Administration, and why DOGE is going to fail to modernize them. Written by people who know how to do this kind of work, and who understand the SSA.
#LegacySystems #COBOL #SSA #doge
https://www.wethebuilders.org/posts/what-it-really-takes-to-migrate-cobol
-
🧯 What Could Go Wrong? DOGE to Rapidly Rewrite SSA's COBOL Codebase
DOGE, a Musk-aligned “efficiency” department, plans to migrate the Social Security Administration’s COBOL-based systems to modern code like Java — in just a few months.
💬 Experts are raising serious alarms:
🔹 SSA systems serve 65M+ Americans and haven’t been overhauled since the 1980s.
🔹 COBOL is deeply embedded in core logic: SSNs, benefit payments, entitlements.
🔹 A proper migration should take years, not months.
🔹 The team pushing this effort reportedly includes inexperienced engineers who can’t read COBOL or navigate legacy mainframe architecture.
🔹 DOGE may use generative AI for code translation — without fully grasping edge cases or dependencies.
SSA insiders describe the environment as a “house of cards” — where even minor changes can break everything.
⚠️ Tech modernization must be grounded in humility, expertise, and realism — not political theater and wishful thinking.
#GovTech #SSA #DigitalTransformation #LegacySystems #COBOL #CyberSecurity #MainframeModernization #PublicSectorIT
-
⚠️ DOGE’s Rushed Plan to Rebuild SSA Systems Will Be a Disaster - FFS 🤦🏻♂️
WIRED reports that the Department of Government Efficiency (DOGE) is pushing to rewrite the Social Security Administration’s COBOL-based systems in a matter of months — a job that experts agree should take years.
🧠 What’s at stake:
🔹 SSA supports 65+ million Americans — stability is non-negotiable.
🔹 COBOL runs the core logic for Social Security numbers, payments, and entitlements.
🔹 Even small changes could lead to silent system failures or missed benefits.
🔹 DOGE’s team — reportedly made up of young, inexperienced hires — lacks the skill to read COBOL or grasp mainframe architecture.
🔹 Without understanding the legacy code, there’s no safe path to rewriting it.
This is not just a tech migration — it’s a potential house-of-cards collapse. Critical national infrastructure cannot be rewritten with duct tape and hubris. 🧯
#GovTech #SSA #LegacySystems #COBOL #DigitalTransformation #Mainframes #PublicSectorIT #software #programming
https://www.wired.com/story/doge-rebuild-social-security-administration-cobol-benefits/