Connecting Technical Training Data to the Digital Thread

Customer Insight: Navy Acquisition Requirements for Training Transformation (ARTT) Project

Engineering data living in a Product Lifecycle Management (PLM) system is the foundation on which systems are built. It follows that the same data should be leveraged when building training requirements, curriculum, and immersive learning environments for those systems.

The engineering data is subject to constant change driven by rapid technology growth and mission requirements. Historically, technical training data has existed in disconnected environments, separate from all this key product information. This means that the versions of systems taught in the classroom lag behind the most current versions of equipment in the field.

The Navy’s Acquisition Requirements for Training Transformation (ARTT) project is taking military training data management to the next level, and GPSL is part of the expert team of vendors working on this exciting project.

The project integrates multiple data standards and systems to create a digital thread from the engineering source data down to the training artifacts delivered in the classroom. In that thread, the maintenance task analysis is used as the authoritative source for learning objectives, curriculum, and technical manuals.

Connecting and coordinating so many disparate data sources has presented GPSL with unique challenges, and our team has used its collective experience and knowledge to overcome them.

We are incredibly proud to be involved in this project, but why is it so important, and what does the future look like for the project?

We caught up with Wayne Gafford, ARTT Project Manager and a Navy Program Analyst, to find out…

The History

ARTT uses Natural Language Processing (NLP) to generate technical learning objectives from maintenance task analyses. What is the history behind this approach?

In the early 2000s, in what was then the Navy's integrated learning environment, we were looking at how technical manual requirements connected to training requirements.

Move forward 20 years, and we have developed artificial intelligence in the form of natural language processing software that can read maintenance task analysis properties, pull out key pieces of data, and reformulate them into a Navy-compliant learning objective.

So, from initially asking “what are the technical data requirements for training?”, we have progressed to creating capabilities that directly link training to the authoritative engineering data source upstream for the entire product lifecycle. ARTT also uses the maintenance task analysis data to prioritize the tasks to be trained, then applies learning levels to the objectives.
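The reformulation step described above can be pictured with a minimal sketch. The field names (`verb`, `object`, `condition`, `standard`) are hypothetical stand-ins for maintenance task analysis properties, not the Navy's actual schema, and real NLP tooling would extract them from free text rather than a ready-made dictionary:

```python
# Hypothetical sketch: turning maintenance task analysis properties into a
# learning-objective sentence. Field names are illustrative, not a real schema.

def learning_objective(task: dict) -> str:
    """Reformulate task-analysis properties into an objective statement."""
    return (
        f"Given {task['condition']}, the learner will "
        f"{task['verb']} the {task['object']} {task['standard']}."
    )

task = {
    "verb": "replace",
    "object": "hydraulic pump",
    "condition": "a standard tool kit and the applicable technical manual",
    "standard": "within 30 minutes and in accordance with the manual",
}

print(learning_objective(task))
```

The point is only that the objective is *derived* from the task analysis rather than authored separately, which is what keeps the two in step.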

Creating the Training Link

How does the training material remain linked to the upstream product engineering data?

When you derive technical learning objectives directly from the maintenance task analysis, you create a digital thread that allows for configuration management.

This means that a particular learning objective and the corresponding course are forever linked through product component identifiers upstream to authoritative source information, such as a list of parts or an engineering drawing.
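A minimal sketch of that configuration-management link, assuming a hypothetical data model in which each learning objective carries the component identifiers it was derived from, so an engineering change to a part can flag every affected training asset:

```python
# Hypothetical sketch of the digital-thread link: each learning objective
# records the part numbers it was derived from, so a change to a part can
# surface every training asset that needs review. All identifiers are invented.

objectives = [
    {"id": "LO-001", "title": "Replace hydraulic pump", "parts": {"P/N-4471", "P/N-9902"}},
    {"id": "LO-002", "title": "Inspect fuel manifold", "parts": {"P/N-1234"}},
]

def affected_objectives(changed_part: str) -> list[str]:
    """Return the learning objectives that must be reviewed after a part change."""
    return [o["id"] for o in objectives if changed_part in o["parts"]]

print(affected_objectives("P/N-4471"))
```

In a real deployment these links would live in the PLM system itself, but the query is conceptually the same: follow the identifiers upstream.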

We manage technical training assets like a part in a warehouse: we know what we have, where it’s located, and where it needs to be delivered.

The Benefits

What is the benefit of linking training to the source product engineering data in this way?

Having the technical learning content managed and authored in the very same space as all of the other product information brings accuracy and timeliness to your learning content and ensures that the information used by the fleet for training is the latest and the greatest.

The Significance

How big a change is this in terms of the evolution of training content?

This is not just a technology change; it is a huge cultural and organizational change too.

The digital thread and Product Lifecycle Management systems have been in use within the Navy and within commercial industries for the last decade. However, training data has always been disconnected from these systems. This causes readiness issues due to out-of-date and inefficient tracking and linking of technical curriculum to original product design.

Bringing training data in and managing that training content like any other piece of content on the digital thread has far-reaching impacts.

It is good for the Navy, and it's good for Defense as a whole, but because we are using commercially based data standards, the approach is just as applicable to organizations outside the military, such as commercial companies like Caterpillar or Saab.

GPSL's Role

When and why did GPSL become a part of the ARTT Team?

It all started back in August of 2018.

We really wanted to have PTC Windchill PLM expertise on the team, and PTC referred us to GPSL. Windchill is at the very heart of the Navy’s Model-Based Product Support (MBPS) initiative, and ARTT will be integrated into MBPS in October 2020.

Once onboard, GPSL became the technical lead for the integration of the training data with Windchill.

They were instrumental in setting up our surface and subsurface system use cases in ARTT. This enables us to demonstrate how training is configured to other data and also how training stakeholders can be instantly notified through workflow when content must be reviewed against an engineering change proposal.

GPSL’s Chris Moffett, Michael Friedman, Christina Bergling, and Scott Allshouse have been supporters of ARTT from the very beginning. It's really great to see how excited they are about extending the digital thread out to training, competencies, and even to credentials.

The Impact

What wider impact do you think the ARTT project will have?

I really believe that the next big market for data management is going to be technical training.

You need technically trained people to work on systems, and if there is no training, then the types of complex platforms that PTC Windchill and other PLM systems support will simply not perform to the required expectations.

Having the data there from a maintenance perspective is all very well and good, but if you don't have the training data correct and ready for use in the classroom or on the job, then you have a large readiness issue.

We are now working on the ability to do performance measurement by looking at all of the different pieces of information in a maintenance task analysis that establish a standard or a degree of behavior.

For example, time: if we have a property that says it must be done in under thirty minutes, then we know that that becomes a baseline by which to measure someone's performance down the line.
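The time example above can be sketched in a few lines. The property name and threshold are hypothetical; the idea is simply that a standard stated in the task analysis becomes the baseline for scoring observed performance:

```python
# Hypothetical sketch: a task-analysis property ("complete in under thirty
# minutes") becomes the baseline against which observed performance is scored.

BASELINE_MINUTES = 30  # taken from the maintenance task analysis property

def meets_standard(observed_minutes: float) -> bool:
    """Did the learner perform the task within the required time?"""
    return observed_minutes < BASELINE_MINUTES

print(meets_standard(24.5))  # faster than the baseline: passes
print(meets_standard(41.0))  # slower than the baseline: flagged for review
```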

This is critical for Defense, but is also something that is a high-value solution commercially across any industry where there's a complex piece of machinery that needs to be maintained either by the manufacturer or through the supply chain.

ARTT is also interfacing previously stove-piped data with machine-readable models.

For example, we have modelled manpower data, training data, and system data and linked them together in the World Wide Web Consortium’s Resource Description Framework (RDF) then encoded the content in Linked Open Data (LOD). RDF and LOD drive the semantic web, and to a large part, the economy.

So now we can tell you every credential in the Navy that belongs to a specific maintenance task analysis, and we can tell you every technical manual that belongs to a particular job description.
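The kind of query described above can be sketched with plain subject-predicate-object triples, the basic shape of RDF. The URIs and predicates below are invented for illustration and are not the project's actual vocabulary:

```python
# Hypothetical RDF-style triples (subject, predicate, object). All names are
# invented for illustration, not the project's actual vocabulary.
triples = [
    ("credential:ET-Journeyman", "requires", "mta:ReplaceHydraulicPump"),
    ("credential:ET-Master", "requires", "mta:ReplaceHydraulicPump"),
    ("manual:TM-55-1900", "supports", "job:HydraulicsTechnician"),
    ("manual:TM-55-2100", "supports", "job:HydraulicsTechnician"),
]

def subjects(predicate: str, obj: str) -> list[str]:
    """Every subject linked to `obj` by `predicate` -- e.g. every credential
    that requires a given maintenance task analysis."""
    return [s for s, p, o in triples if p == predicate and o == obj]

print(subjects("requires", "mta:ReplaceHydraulicPump"))
print(subjects("supports", "job:HydraulicsTechnician"))
```

In practice these triples would be published as Linked Open Data and queried with SPARQL, but the traversal is the same: pick a predicate and an object, and collect everything linked to it.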

This type of data in machine-readable formats creates powerful analytical tools.

The Future

What does the future look like for ARTT?

There are endless possibilities.

I think one of the more interesting areas to explore is how we can better develop models, games, simulations, virtual environments, and augmented reality automatically from the same product data.

We know that we can output a SCORM course, we can output an IETM, we can output a PDF. Now, we can turn our attention to automatically outputting a simulation or a model that's used for immersive learning.

This still has the very same issues as any other type of course that gets delivered. It's a configuration data item. If the authoritative sources change, then that immersive environment has to be reviewed for a design change just like any other piece of product data.

How can we use decision support tools to build simulations with components of those immersive environments digitally threaded back upstream to authoritative source?

More importantly, how do we ensure that the right performance measurement hooks are in place when a learner engages with that immersive environment?

If we can capture their activity streams and add them into a learning record store, we can use that as a means to compare against baseline behavioural performance requirements.

For us to be thinking about that on ARTT is an indication of the excitement level on our team, even as we cover our bases now and make sure the fundamentals are in place.

Find out more

It’s the experience and knowledge amongst our team on the ARTT project that make the difference between a good team and a GREAT team.

Like all of our projects, this one started with a conversation.

If you would like to find out more about our involvement in ARTT or talk to us about your specific requirements, please get in touch with us today and let's have that initial conversation.

More about the ARTT Project

The ARTT project is prototyping the ability to create a “digital thread” linking product specifications, training development, and shipboard performance.

The data carried along this thread is expressed in terms of competencies (knowledge, skills, and abilities) required for maintenance tasks.

In the ARTT vision, competencies are extracted from standardized product data and fed forward into tools used for training analysis, design, and development.

When changes are made in the product specification, they create alerts that cause training to be updated. Meanwhile, data from maintenance logs is tied back to the competencies addressed in training and used to indicate which competencies have or have not been successfully trained.

This informs the nature and frequency of the training and supports a machine-aided change management process.

The History of the Project

While working at the Port Hueneme Naval Surface Warfare Center, Wayne Gafford worked in the Product Lifecycle Management group, which evolved into MBPS.

In July of 2017, he was asked to develop a Cooperative Research and Development Agreement (CRADA) on how learning analytics and PLM systems can work together.

He attended the annual Advanced Distributed Learning conference. That year, the theme was Competency Management. During the conference, a panel of competency experts discussed the issues. One panellist, Jeanne Kitchens, talked about competencies mainly from a linked open data perspective.

In March of 2018, a detailed CRADA was signed. Wayne became the Navy Principal Investigator and Jeanne became the Industry Principal Investigator. They gathered a talented team and immediately started mapping GEIA 0007 and S3000L properties to Credential Engine’s Credential Transparency Description Language properties.

Those mappings became the first digital thread between maintenance task analysis properties and properties that describe credentials and competencies.
