Writing UT A-Scans¶
To learn how to write UT A-scans to an .nde file, follow this simple procedure:
- JSON formatted Properties dataset:
  - Create the dataset according to the Properties data model
  - Validate the dataset structure against the Properties JSON Schema
- JSON formatted Setup dataset:
  - Create the dataset according to the Setup data model
  - Validate this JSON against the Setup JSON Schema
- Simulate or collect A-scans to be saved in an AScanAmplitude dataset
- Save the datasets according to the .nde HDF5 structure
JSON formatted Properties dataset¶
Create the dataset¶
According to the Properties dataset data model documentation, the only required properties are $schema at the root, creationDate and formatVersion in the file object, and the related method in the methods array.
The Properties dataset therefore looks as follows:
{
    "$schema": "./Properties-Schema-4.0.0.json",
    "file": {
        "creationDate": "2024-10-16T20:28:30+01:00",
        "formatVersion": "4.0.0"
    },
    "methods": ["UT"]
}
Validate the dataset structure¶
Then, validate the JSON file against the Properties JSON Schema, assuming you saved the above Properties JSON as properties_ut_ascans.json:
import fastjsonschema
import json

# Load the Properties dataset and the Properties JSON Schema
with open('properties_ut_ascans.json', 'r') as f:
    properties = json.load(f)
with open('NDE-Properties.json', 'r') as f:
    properties_schema = json.load(f)

# Compile the schema once and validate the dataset against it
properties_validator = fastjsonschema.compile(properties_schema)
try:
    properties_validator(properties)
    print('Properties JSON succeeded validation')
except fastjsonschema.JsonSchemaException as e:
    print(f"Properties JSON failed validation: {e}")
The above validation should complete without raising any errors.
JSON formatted Setup dataset¶
Create the dataset¶
According to the Setup dataset data model documentation, there are a couple of objects we need to populate to end up with a valid dataset:
{
"$schema": "...",
"version": "...",
"scenario": "...",
"groups": [],
"acquisitionUnits": [],
"dataMappings": [],
"specimens": [],
"probes": [],
"wedges": []
}
The first three properties are straightforward: they reference the JSON Schema used to validate the JSON, the version of the file format, and the scenario used. As we are not scanning a weld, let's adopt the General Mapping scenario conventions.
The remaining objects and arrays cover:
- The definition of the groups, datasets, and processes: in our case, a single group, an AScanAmplitude dataset, and an ultrasonicConventional acquisition process
- The definition of the acquisition unit: in the example below, the OmniScan X4 64x128
- The definition of a specimen: in the example below, a 25 mm thick mild steel plate
- The definition of the probe: in the example below, a C109 fingertip contact probe
- The definition of the wedge: as we will be in contact, we will define a wedge with most of its dimensions equal to 0
Because assembling these JSON objects can be tedious the first time, the following template incorporates the above parameters to facilitate this demonstration.
Setup template
{
    "$schema": "./Setup-Schema-4.0.0.json",
    "version": "4.0.0",
    "scenario": "General Mapping",
    "groups": [
        {
            "id": 0,
            "name": "GR-1",
            "datasets": [
                {
                    "id": 0,
                    "dataTransformations": [
                        {
                            "processId": 0
                        }
                    ],
                    "dataClass": "AScanAmplitude",
                    "storageMode": "Paintbrush",
                    "dataValue": {
                        "min": -1,
                        "max": 1,
                        "unitMin": -100.0,
                        "unitMax": 100.0,
                        "unit": "Percent"
                    },
                    "path": "/Public/Groups/0/Datasets/0-AScanAmplitude",
                    "dimensions": [
                        {
                            "axis": "UCoordinate",
                            "offset": 0.0,
                            "quantity": 5,
                            "resolution": 0.001
                        },
                        {
                            "axis": "VCoordinate",
                            "offset": 0.0,
                            "quantity": 1,
                            "resolution": 0.001
                        },
                        {
                            "axis": "Ultrasound",
                            "offset": 0.0,
                            "quantity": 3000,
                            "resolution": 1E-08
                        }
                    ]
                }
            ],
            "processes": [
                {
                    "inputs": [],
                    "outputs": [
                        {
                            "id": 0,
                            "datasetId": 0,
                            "dataClass": "AScanAmplitude"
                        }
                    ],
                    "id": 0,
                    "implementation": "Hardware",
                    "ultrasonicConventional": {
                        "pulseEcho": {
                            "probeId": 0
                        },
                        "waveMode": "Longitudinal",
                        "velocity": 5890.0,
                        "wedgeDelay": 0.0,
                        "digitizingFrequency": 100000000.0,
                        "rectification": "None",
                        "beams": [
                            {
                                "id": 0,
                                "refractedAngle": 0.0,
                                "ascanStart": 0.0,
                                "ascanLength": 30E-6
                            }
                        ]
                    }
                }
            ]
        }
    ],
    "acquisitionUnits": [
        {
            "id": 0,
            "platform": "X4",
            "name": "MXU",
            "model": "Orion_64x128",
            "serialNumber": "QC-0090228",
            "acquisitionRate": 120.0
        }
    ],
    "specimens": [
        {
            "id": 0,
            "plateGeometry": {
                "width": 0.3,
                "length": 0.3,
                "thickness": 0.025,
                "surfaces": [
                    {
                        "id": 0,
                        "name": "Top"
                    }
                ],
                "material": {
                    "name": "Steel_Mild",
                    "longitudinalWave": {
                        "nominalVelocity": 5890.0,
                        "attenuationCoefficient": 0.087
                    },
                    "transversalVerticalWave": {
                        "nominalVelocity": 3240.0,
                        "attenuationCoefficient": 0.174
                    },
                    "density": 7.8
                }
            }
        }
    ],
    "probes": [
        {
            "id": 0,
            "model": "C109",
            "serie": "CONTACT",
            "conventionalRound": {
                "centralFrequency": 5000000.0,
                "diameter": 0.0127,
                "elements": [
                    {
                        "id": 0,
                        "acquisitionUnitId": 0,
                        "connectorName": "P1"
                    }
                ]
            },
            "wedgeAssociation": {
                "wedgeId": 0,
                "mountingLocationId": 0
            }
        }
    ],
    "wedges": [
        {
            "id": 0,
            "model": "Contact",
            "serie": "Default",
            "angleBeamWedge": {
                "width": 0.1,
                "height": 0.1,
                "length": 0.1,
                "longitudinalVelocity": 2330.0,
                "mountingLocations": [
                    {
                        "id": 0,
                        "wedgeAngle": 0.0,
                        "primaryOffset": -1E-05,
                        "secondaryOffset": 0.0,
                        "tertiaryOffset": 0.0
                    }
                ]
            },
            "positioning": {
                "specimenId": 0,
                "surfaceId": 0,
                "uCoordinateOffset": 0.0,
                "vCoordinateOffset": 0.0,
                "skewAngle": 90.0
            }
        }
    ]
}
Validate the dataset structure¶
Then, validate the JSON file against the Setup JSON Schema, assuming you saved the above Setup JSON as setup_ut_ascans.json:
# Load the Setup dataset and the Setup JSON Schema
with open('setup_ut_ascans.json', 'r') as f:
    setup = json.load(f)
with open('NDE-FileFormat-Schema-4.0.0.json', 'r') as f:
    setup_schema = json.load(f)

setup_validator = fastjsonschema.compile(setup_schema)
try:
    setup_validator(setup)
    print('Setup JSON succeeded validation')
except fastjsonschema.JsonSchemaException as e:
    print(f"Setup JSON failed validation: {e}")
The above validation should complete without raising any errors.
Simulate or collect A-scans¶
We will generate 5 fake A-scans corresponding to 5 theoretical scanner positions. Each A-scan will have a 5 MHz center frequency, a 60% bandwidth, and 3000 samples. The fake A-scans will be stored in a 5 × 1 × 3000 NumPy array named ascans that we will use as our AScanAmplitude dataset.
As the generation of fake A-scans is out of scope for this documentation, the example code below is provided for your convenience.
Fake A-scan generation
import numpy as np

# Constants
sampling_rate = 100e6       # 100 MHz sampling rate
time_window = 30e-6         # 30 microseconds time window
frequency = 5e6             # 5 MHz center frequency
num_ascans = 5              # Number of A-scans (U axis size)
echo_amplitudes = [1, 0.7]  # Relative amplitude of the two echoes
bandwidth = 0.6 * frequency
gaussian_width = 1 / (2 * np.pi * bandwidth)  # Gaussian envelope width (sigma, in seconds)
V_Axis = 1                  # Size of the V axis
Ultrasound_Axis = round(time_window * sampling_rate)

# Time array
t = np.linspace(0, time_window, int(sampling_rate * time_window))

# Gaussian modulated sinusoid function
def gaussian_modulated_sinusoid(t, center, frequency, width):
    gauss = np.exp(-((t - center) ** 2) / (2 * width ** 2))
    sinusoid = np.sin(2 * np.pi * frequency * (t - center))
    return gauss * sinusoid

# Generate A-scans with varying echo positions
ascans = np.empty(shape=(num_ascans, V_Axis, Ultrasound_Axis))
for i in range(num_ascans):
    # Echo positions slightly shifted for each A-scan
    echo_positions = [10e-6 + i * 0.1e-6, 20e-6 + i * 0.1e-6]
    # Generate the A-scan as the sum of the two echoes
    ascan = np.zeros_like(t)
    for echo_pos, amplitude in zip(echo_positions, echo_amplitudes):
        ascan += amplitude * gaussian_modulated_sinusoid(t, echo_pos, frequency, gaussian_width)
    ascans[i, 0, :] = ascan
Save the datasets according to the .nde HDF5 structure¶
We now need to create the .nde file with the HDF5 library, following the HDF5 structure common to all .nde files. The Properties JSON is saved at the root of this structure, the Setup JSON under the /Public/ path, and the A-scans in Group 0 under /Public/Groups/0/Datasets.
import h5py

with h5py.File('ut_ascans.nde', 'w') as hdf5_file:
    # Create the file structure
    public_section = hdf5_file.create_group('Public')
    groups_section = public_section.create_group('Groups')
    group_0 = groups_section.create_group('0')
    group_0_datasets = group_0.create_group('Datasets')

    # Convert the JSON data to strings
    setup_string = json.dumps(setup, indent=4)
    properties_string = json.dumps(properties, indent=4)

    # Save the JSON data under the respective paths
    hdf5_file.create_dataset('Public/Setup', data=setup_string, dtype=h5py.string_dtype('utf-8', len(setup_string)))
    hdf5_file.create_dataset('Properties', data=properties_string, dtype=h5py.string_dtype('utf-8', len(properties_string)))

    # Save the NumPy ascans under Group 0 datasets with the AScanAmplitude data class
    group_0_datasets.create_dataset('0-AScanAmplitude', data=ascans)

print("Data successfully saved to .nde")
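To confirm the layout before opening the file in an analysis tool, you can read it back with h5py and walk the HDF5 tree. The snippet below is a minimal sketch: so that it runs on its own, it first writes a stand-in ut_ascans.nde with placeholder JSON content and a zero-filled array; in practice you would simply open the file produced by the steps above.

```python
import json
import h5py
import numpy as np

# Stand-in file so this snippet is self-contained; the placeholder
# Properties/Setup content is NOT the full JSON built earlier.
with h5py.File('ut_ascans.nde', 'w') as f:
    f.create_dataset('Properties', data=json.dumps({"methods": ["UT"]}))
    f.create_dataset('Public/Setup', data=json.dumps({"version": "4.0.0"}))
    f.create_dataset('Public/Groups/0/Datasets/0-AScanAmplitude',
                     data=np.zeros((5, 1, 3000)))

# Read the file back and confirm the expected structure
with h5py.File('ut_ascans.nde', 'r') as f:
    f.visit(print)  # Prints every group and dataset path in the file
    properties = json.loads(f['Properties'][()])
    ascans = f['Public/Groups/0/Datasets/0-AScanAmplitude'][()]
    print(properties['methods'], ascans.shape)
```

The dataset shape should match the dimensions declared in the Setup JSON (5 × 1 × 3000); a mismatch here is a common cause of files that validate but display incorrectly.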
You should end up with the following file: