Last fall at the Army Applications Lab Vertex event, I mentioned that I had created a simple AI persona with ChatGPT called MARC: the MOSA Assistant for Research Collaboration (available at this link for anyone with an active ChatGPT account). This is a VERY simple GPT customization, with just a little training and tuning to give a few decent answers to common MOSA questions.
A Sample Chat with MARC

Feel free to love or hate the response, but it is not half bad. The version I am working on in the background for our members is even better. Let me know your thoughts! Here is an example of MARC answering the question:
“I want to understand the difference between conformance to a standard (which is useful at a modular level for a specific key interface) and system level MOSA conformance on an entire platform.”
MARC Explains (direct ChatGPT Output)
The following (between the lines) is direct output from MARC, without any editorial changes. Below that are some additional thoughts on where we go from here. Update: I did go back and ask MARC to reformat the output as an HTML <div> section to make it easier to copy and paste.
Difference Between Modular-Level and System-Level MOSA Conformance
1. Modular Level Conformance (Key Interface Focus)
Definition: Modular conformance applies to specific components or subsystems that adhere to defined open standards, such as the Future Airborne Capability Environment (FACE) Technical Standard or Sensor Open Systems Architecture (SOSA).
Scope: Limited to a single interface or functional module within the larger system.
Purpose: Ensures interoperability and interchangeability at key integration points without redesigning the entire platform.
Examples: A FACE conformant avionics module or a transport service adhering to ARINC 653 standards.
Outcome: The module can be reused across different systems, enabling easier upgrades and cross-platform deployment.
Key Insight: FACE conformance ensures that a control component can interact with different systems that use the same standard.
2. System-Level MOSA Conformance (Whole Platform Approach)
Definition: System-level conformance involves designing and developing an entire platform or weapon system using the MOSA framework, integrating open standards across the full lifecycle.
Scope: Applies to the complete system from “nose to tail,” covering all major components and subsystems.
Purpose: Enhances lifecycle adaptability, improves competition, reduces costs, and enables incremental technology refresh.
Examples: A fighter jet or ground vehicle with open interfaces across avionics, communications, and sensor suites, enabling third-party vendors to develop and integrate solutions.
Outcome: The platform can evolve over time, incorporating new technologies without extensive re-engineering.
Key Insight: The MOSA strategy addresses both business and technical issues, ensuring that all components across the system lifecycle conform to open architectures.
Key Differences:
| Aspect | Modular-Level Conformance | System-Level MOSA Conformance |
|---|---|---|
| Scope | Single module or key interface | Entire system or platform |
| Objective | Interoperability of specific components | Lifecycle adaptability and cost-efficiency |
| Effort Required | Limited to module development | Requires enterprise-wide architectural planning |
| Impact on Lifecycle | Facilitates incremental upgrades | Enables long-term system evolution |
| Compliance Example | FACE Technical Standard conformance | Full system adherence to MOSA principles |
Why This Matters:
Modular conformance is essential for achieving interoperability within specific interfaces, while system-level MOSA drives larger strategic goals, ensuring the whole platform remains adaptable and competitive over time.
Programs often begin with modular conformance to specific standards and gradually scale to system-wide MOSA as part of their long-term acquisition strategy.
Would you like more detail on how MOSA conformance is evaluated or examples of successful implementations?
Where We Go From Here
Now that I have officially launched the MOSA Network website, I am interested in soliciting community feedback on making my more finely tuned private experiments available to members of the MOSA community. Feel free to mess around with MARC and see what hallucinations and other issues you can uncover; more importantly, send me the suggestions you would like to see incorporated in a real beta version for MOSA Network community members.
First and foremost, this is just one of a huge number of tools, so spending time on a niche MOSA support tool is only worth the effort if members of the community see value in it. I certainly do, given how much fog surrounds the basic understanding of the MOSA pillars and how to apply them to specific situations. Here is a non-prioritized laundry list of things I am working on for a real beta version of MARC.
- Ability to handle CUI and work offline for classified projects (experimenting with Phi-3 and other models that can run disconnected).
- Ability to properly and fully understand the various standards, how they fit (or conflict) when stitched together, and make suggestions on integration.
- Very clear and well-formatted architecture description documentation for models and code, to speed up the review process for things that work on the bench.
- Direct support for conformance, particularly when many modules are pulled together into a whole complex system.
There is more, but this is a decent summary of what I am working on. What would you like to add to this list?