How to manage Conduit piping

From 42Q

Introduction

The purpose of this document is to teach the user how to use the PipeParameter operation in a Conduit request, as well as how to set up the command’s parameters.

Conduit can be thought of as an executor of distinct Operations. Each operation is a command or a macro encapsulating some behavior into a named entity. Many operations are designed to directly manipulate the device history record (DHR) of a serial number, but ultimately an operation can exhibit nearly any behavior deemed useful and/or relevant to production manufacturing. Each operation requires a number of prompts or parameters in order to execute successfully; that number ranges from zero (no parameters at all) to a dozen or more.

Conduit’s primary design methodology has been to create a large number of relatively simple, well-defined operations specific to a single action. By composing multiple operations into a single request, clients can perform complicated tasks using these relatively simple building blocks.

By default, Conduit supports blocks or sets of operations in a single request. Performing lots of “work” is as simple as sending as many operations as required to adjust (append to) the unit’s history accordingly. However, in some cases, users may want some level of interoperability between the discrete operations. More concretely, they may want to base the inputs of operation N on information obtained by executing operation N - 1. Since each operation is its own miniature “program” or function, users need a way to feed or share data from one operation into another.

This is achieved by using Pipe operations, more specifically the PipeParameter operation.

Before outlining the actual usage of PipeParameter, there is one thing to keep in mind: PipeParameter is really just a way to avoid making multiple, small requests to Conduit. Anything you can do with piping could be mimicked in a client application by asking Conduit to execute a single operation, taking the results of that operation, and then constructing a follow-up call with the extracted fields. That approach is cumbersome and likely very inefficient, but it captures everything the PipeParameter operation does: it links, or glues, a composed set of operations together so we can reduce round trips to Conduit and avoid implementing complicated client logic.

Pipe Parameter

PipeParameter provides a mechanism to bypass the typical prompting requirements of a Conduit operation: it enables clients to feed/push/pipe data from some source field into a destination field on a subsequent, non-pipe operation. PipeParameter only interacts with NON-PIPE operations, which means sequential PipeParameter calls act as if the adjacent PipeParameter calls are not present. The from_field (data) source for a PipeParameter call is the last NON-PIPE operation executed before the PipeParameter call, and the destination to_field is on the first NON-PIPE operation following the PipeParameter call. PipeParameter itself has three parameters driving its execution: Source, From Field and To Field.
 

Figure 1: Three Parameters Of PipeParameter

HTMCP Three Parameters Of PipeParameter.png

Testmacro.png

 

HT PipeP.png

 

Source

The source field tells the PipeParameter operation where to look for the piece of information identified by the from_field parameter. At this time the possible values for source are: Data (data), Data Pointer (data_ptr), Input Parameter (input_parameter) and Scanned Unit (scanned_unit).

 

Data (data)

data is the default source when none is provided. Using source data means the PipeParameter call will seek out the specified from_field in the execution results of the last NON-PIPE operation executed before this call. Note that not all operations produce meaningful data, and there is currently no published schema of the data each operation produces (if any is produced at all).

 

Data Pointer (data_ptr)

Identical to data, except that the from_field is specified using JSON Pointer syntax.

 

Input Parameter (input_parameter)

Take the from_field input parameter of the previously executed NON-PIPE operation and push that value into the to_field of the next NON-PIPE operation that follows.

 

Scanned Unit (scanned_unit)

The from_field, specified using JSON Pointer syntax, is taken from the snapshot of unit information Conduit gathers while scanning the transaction unit supplied with the Conduit request. A formal list of the fields available in the scanned unit snapshot has not yet been published, but quite a bit of data is gathered during the process of scanning the transaction unit.
 

From Field

Within the specified source, identify the value associated with the from_field name.

 

To Field

Populate the to_field name with the value extracted from source -> from_field, as if the client had explicitly populated that value when constructing the call in the request.

 

Macros

The macro mechanism in Conduit and the command register allow administrators to encapsulate custom behaviors into a single named entity. This includes removing optional fields not relevant for that use case and overriding prompts to better fit the specific manufacturing domain. Many “built-in” macros take advantage of piping, including:
StartDefectReprocessing
RESCAN
 

Example Cases

How to use CommandName and AddComment

The easiest way to understand Pipe Parameter is by example. We’ll start with a few simple cases and then move on to more exotic and/or tedious applications.

In order to describe an instance of a Conduit operation we will use the following syntax:

CommandName <field_name1: value> <field_name2: value2> <field_nameN: valueN>

 

Or for a call to AddComment:

AddComment <comment_text: This is some text>

 

Just to be clear, a Conduit command is actually a JSON object with at least a field called name, plus any other parameters provided, so the AddComment command above could be modeled as:

{"name": "AddComment", "comment_text": "This is some text"}
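
Composing multiple operations into one request then amounts to sending an ordered list of these command objects. As a sketch (the exact request envelope around the list may vary by deployment), the CommandName and AddComment calls above could be modeled as:

```json
[
    {"name": "CommandName", "field_name1": "value", "field_name2": "value2"},
    {"name": "AddComment", "comment_text": "This is some text"}
]
```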
 

Figure 2: CommandName and AddComment Example

HTMCP CommandName and AddComment Example.png

How to use Source: data

The simplest possible pipe parameter scenario: AddMeasurementKey produces a data field called measurement_key, which we pipe to the comment_text field of the AddComment operation.

AddMeasurementKey
PipeParameter <from_field: measurement_key> <to_field: comment_text>
AddComment
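
Using the JSON object modeling described earlier, this sequence could be sketched as an ordered list of command objects (request envelope omitted):

```json
[
    {"name": "AddMeasurementKey"},
    {"name": "PipeParameter", "from_field": "measurement_key", "to_field": "comment_text"},
    {"name": "AddComment"}
]
```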

 

Figure 3: Source data Example

HTMCP Source data Example.png


How to use Source: scanned_unit

The ScannedUnit produced during the scanning process has a large dataset, including an active list of unremoved attributes. Here we use JSON Pointer syntax to access the attr_data field of the first element of the attributes array under the unit_elements root element.

PipeParameter <from_field: /unit_elements/attributes/0/attr_data> <source: scanned_unit> <to_field: comment_text>
AddComment
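
In JSON form, this example could be sketched as follows; note the from_field value is a JSON Pointer into the scanned unit snapshot:

```json
[
    {"name": "PipeParameter",
     "from_field": "/unit_elements/attributes/0/attr_data",
     "source": "scanned_unit",
     "to_field": "comment_text"},
    {"name": "AddComment"}
]
```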

 

Figure 4: Source scanned_unit Example

HTMCP Source scanned unit Example.png


How to use Source: input_parameter

Sometimes we want to pass the same value provided to one operation to a later operation in the transaction instead of prompting for a piece of data twice. Here we pipe the defect_code input parameter from the first RecordDefect call to the ref_designator field of the second RecordDefect call.

RecordDefect <defect_code: 0101> <ref_designator: N/A>   
PipeParameter <from_field: defect_code> <source: input_parameter> <to_field: ref_designator> 
RecordDefect <defect_code: 0102>   
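
The same transaction sketched as JSON command objects; the piped defect_code value from the first call ends up in the second RecordDefect’s ref_designator:

```json
[
    {"name": "RecordDefect", "defect_code": "0101", "ref_designator": "N/A"},
    {"name": "PipeParameter", "from_field": "defect_code", "source": "input_parameter", "to_field": "ref_designator"},
    {"name": "RecordDefect", "defect_code": "0102"}
]
```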

 

Figure 5: Input Parameter Example

HTMCP Input Parameter Example.png


How to use Source: data_ptr

The JSON Pointer version of the previous NON-PIPE operation’s results. The plain data source tries very hard to identify the from_field, including iterating through various elements. The data_ptr version allows you to explicitly identify the from_field when that is necessary. As you can see, the plain data source is more approachable in most cases, since data_ptr usage requires explicit knowledge of the data results produced by the operation.


AddMeasurementKey
PipeParameter <from_field: /0/data/measurement_key> <source: data_ptr> <to_field: comment_text>
AddComment
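
Sketched as JSON, the pointer /0/data/measurement_key appears to address the measurement_key field inside the data of the first (index 0) operation result:

```json
[
    {"name": "AddMeasurementKey"},
    {"name": "PipeParameter", "from_field": "/0/data/measurement_key", "source": "data_ptr", "to_field": "comment_text"},
    {"name": "AddComment"}
]
```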


Figure 6: Source data_ptr Example

HTMCP Source data ptr Example.png


How to use ‘Piggy-Back’ Input Parameter

Sometimes we may want to propagate the results of an operation or operations executed earlier in a set of commands to operations much later in the transaction. As soon as we need to bridge across multiple NON-PIPE operations, we need to get creative. Each operation has a known list of field names relevant to its execution, as defined in the client-command-registry. Any fields on that operation outside this list are effectively ignored.

This means we can pipe fields unrelated to an operation into a field name that operation doesn’t care about, then, after executing that intermediate operation, pipe that unrelated field further down the list of calls. This allows us to percolate meaningful data from an earlier call all the way to the last operation in a transaction.

Note: piggyback is an example of a to_field that won’t collide with any field name the intermediate operation(s) might consume during execution.

 

ReadFlexField <flex_field: board_label_name> <reference_field: part_number> <reference_table: part> <reference_value: ~CDIAGPART001~>
PipeParameter <from_field: flex_value> <to_field: piggyback>                                                                                               
ReadFlexField <flex_field: board_algorithm> <reference_field: part_number> <reference_table: part> <reference_value: ~CDIAGPART001~>       
PipeParameter <from_field: piggyback> <source: input_parameter> <to_field: attr_name>                                 
PipeParameter <from_field: flex_value> <to_field: attr_data>            
AddAttribute    
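
The full piggy-back transaction as JSON command objects. Because piggyback is not a field ReadFlexField consumes, it rides along untouched until the final pipes populate AddAttribute:

```json
[
    {"name": "ReadFlexField", "flex_field": "board_label_name", "reference_field": "part_number",
     "reference_table": "part", "reference_value": "~CDIAGPART001~"},
    {"name": "PipeParameter", "from_field": "flex_value", "to_field": "piggyback"},
    {"name": "ReadFlexField", "flex_field": "board_algorithm", "reference_field": "part_number",
     "reference_table": "part", "reference_value": "~CDIAGPART001~"},
    {"name": "PipeParameter", "from_field": "piggyback", "source": "input_parameter", "to_field": "attr_name"},
    {"name": "PipeParameter", "from_field": "flex_value", "to_field": "attr_data"},
    {"name": "AddAttribute"}
]
```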

 

Figure 7: ‘Piggy-Back’ Input Parameter Example

HTMCP ‘Piggy-Back’ Input Parameter Example.png


How to use Replace Tree Component

Find a tree component attached somewhere in the scanned unit’s component hierarchy and replace that component.

 

PipeParameter <from_field: /serial_number> <source: scanned_unit> <to_field: unit_serial_number>

FindUnitTreeComponent <not_found_error_message: Unable to identify component to remove> 

PipeParameter <from_field: parent_serial_number> <source: data> <to_field: unit_serial_number>

PipeParameter <from_field: component_id> <source: input_parameter> <to_field: old_component_id>    
ReplaceUnitComponent
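
In JSON form (request envelope omitted):

```json
[
    {"name": "PipeParameter", "from_field": "/serial_number", "source": "scanned_unit", "to_field": "unit_serial_number"},
    {"name": "FindUnitTreeComponent", "not_found_error_message": "Unable to identify component to remove"},
    {"name": "PipeParameter", "from_field": "parent_serial_number", "source": "data", "to_field": "unit_serial_number"},
    {"name": "PipeParameter", "from_field": "component_id", "source": "input_parameter", "to_field": "old_component_id"},
    {"name": "ReplaceUnitComponent"}
]
```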

 

Figure 8: Replace Tree Component Example

HTMCP Replace Tree Component Example.png


How to render, store and record media

RenderLabel <label_name: le_24x_test_dup_allowed> <test_print: 0>
PipeParameter <from_field: rendered_file> <source: data> <to_field: payload_file>
PipeParameter <from_field: content_type> <source: data> <to_field: mime_type>
PipeParameter <from_field: original_filename> <source: data> <to_field: original_filename>
StoreMedia <media_alias: Weir Hardness Certificate Testing> <media_description: Rendered Label> <media_tag: CPI> <storage_level: site>
PipeParameter <from_field: identifier> <source: data> <to_field: media_identifier>
RecordMedia
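
And the same render, store and record chain sketched as JSON command objects:

```json
[
    {"name": "RenderLabel", "label_name": "le_24x_test_dup_allowed", "test_print": "0"},
    {"name": "PipeParameter", "from_field": "rendered_file", "source": "data", "to_field": "payload_file"},
    {"name": "PipeParameter", "from_field": "content_type", "source": "data", "to_field": "mime_type"},
    {"name": "PipeParameter", "from_field": "original_filename", "source": "data", "to_field": "original_filename"},
    {"name": "StoreMedia", "media_alias": "Weir Hardness Certificate Testing", "media_description": "Rendered Label",
     "media_tag": "CPI", "storage_level": "site"},
    {"name": "PipeParameter", "from_field": "identifier", "source": "data", "to_field": "media_identifier"},
    {"name": "RecordMedia"}
]
```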

 

Figure 9: Render, Store And Record Media Example

HW pipetest.png