
MTS workflow high-level summary #142

Open
Kvieta1990 opened this issue May 30, 2022 · 0 comments
Labels: Priority: High (high priority feature / fix), Type: Docs (updates or fixes to documentation)

Kvieta1990 commented May 30, 2022

mantidtotalscattering is a modern engine for reducing neutron total scattering data, built on the Mantid framework. It is a high-level framework that orchestrates various low-level Mantid algorithms, going through all the steps needed to bring raw neutron time-of-flight scattering data to a form ready for user analysis. By running the whole workflow and applying a series of necessary corrections, the end product is expected to be neutron total scattering data on an absolute scale.

This issue contains a high-level summary of the overall workflow of mantidtotalscattering -- to be used as the backbone of the paper reporting this modern interface for reducing neutron total scattering data.

This repository contains the source code of mantidtotalscattering, including the main workflow together with multiple module files for the various utilities used in it. The source code for data reduction mainly lives in the following directory,

https://github.com/neutrons/mantid_total_scattering/tree/next/total_scattering

and the main workflow script is,

https://github.com/neutrons/mantid_total_scattering/blob/next/total_scattering/reduction/total_scattering_reduction.py.
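
To make the parameter reading discussed below concrete, here is a minimal, purely illustrative input configuration. The key names follow the fields read at the top of `TotalScatteringReduction`, but the run numbers, material, density, and geometry values are hypothetical:

```python
# Hypothetical input configuration for TotalScatteringReduction.
# Key names mirror the fields the workflow reads; values are made up.
config = {
    "Facility": "SNS",
    "Instrument": "NOM",
    "Title": "example_reduction",
    "Sample": {
        "Runs": "12345",
        "Material": "Si O2",
        "MassDensity": 2.2,
        "PackingFraction": 0.6,
        "Geometry": {"Radius": 0.3, "Height": 1.8},
        "Background": {"Runs": "12346"},
    },
    "Normalization": {
        "Runs": "12347",
        "Material": "V",
        "MassDensity": 6.11,
        "Geometry": {"Radius": 0.3, "Height": 1.8},
    },
    "Merging": {"QBinning": [0.0, 0.02, 40.0]},
    "OutputDir": ".",
}

# The workflow builds run stems like "<instrument>_<run>" for SNS data.
facility_file_format = "%s_%d" if config["Facility"] == "SNS" else "%s%d"
stem = facility_file_format % (config["Instrument"], 12345)
# -> "NOM_12345"
```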

  1. The main workflow starts from here,

    def TotalScatteringReduction(config: dict = None):

  2. First, read in the general input parameters from the input JSON file,

    if config is None:
        raise RuntimeError('Argument config cannot be None')
    facility = config['Facility']
    title = config['Title']
    instr = config['Instrument']

    # Get an instance to Mantid's logger
    log = Logger("TotalScatteringReduction")

    # Message to be presented at the very end of the reduction via the logger.
    final_message = ''

    # Get sample info
    sample = get_sample(config)
    sam_mass_density = sample.get('MassDensity', None)
    sam_packing_fraction = sample.get('PackingFraction', None)
    sam_geometry = sample.get('Geometry', None)
    sam_material = sample.get('Material', None)

    sam_geo_dict = {'Shape': 'Cylinder',
                    'Radius': config['Sample']['Geometry']['Radius'],
                    'Height': config['Sample']['Geometry']['Height']}
    sam_mat_dict = {'ChemicalFormula': sam_material,
                    'SampleMassDensity': sam_mass_density}
    if 'Environment' in config:
        sam_env_dict = {'Name': config['Environment']['Name'],
                        'Container': config['Environment']['Container']}
    else:
        sam_env_dict = {'Name': 'InAir',
                        'Container': 'PAC06'}

    abs_cache_fn = sam_mat_dict["ChemicalFormula"].replace(" ", "_")
    tmp_fn = "_md_" + str(sam_mat_dict['SampleMassDensity']).replace(".", "p")
    abs_cache_fn += tmp_fn
    abs_cache_fn += ("_pf_" + str(sam_packing_fraction).replace(".", "p"))
    abs_cache_fn += ("_sp_" + sam_geo_dict['Shape'])
    abs_cache_fn += ("_r_" + str(sam_geo_dict['Radius']).replace(".", "p"))
    abs_cache_fn += ("_h_" + str(sam_geo_dict['Height']).replace(".", "p"))
    abs_cache_fn += ("_env_" + sam_env_dict['Name'])
    abs_cache_fn += ("_cont_" + sam_env_dict['Container'])
    # Get normalization info
    van = get_normalization(config)
    van_mass_density = van.get('MassDensity', None)
    van_packing_fraction = van.get('PackingFraction', 1.0)
    van_geometry = van.get('Geometry', None)
    van_material = van.get('Material', 'V')

    van_geo_dict = {'Shape': 'Cylinder',
                    'Radius': config['Normalization']['Geometry']['Radius'],
                    'Height': config['Normalization']['Geometry']['Height']}
    van_mat_dict = {'ChemicalFormula': van_material,
                    'SampleMassDensity': van_mass_density}

    abs_cache_fn_v = van_mat_dict["ChemicalFormula"].replace(" ", "_")
    tmp_fn = "_md_" + str(van_mat_dict['SampleMassDensity']).replace(".", "p")
    abs_cache_fn_v += tmp_fn
    abs_cache_fn_v += ("_pf_" + str(van_packing_fraction).replace(".", "p"))
    abs_cache_fn_v += ("_sp_" + van_geo_dict['Shape'])
    abs_cache_fn_v += ("_r_" + str(van_geo_dict['Radius']).replace(".", "p"))
    abs_cache_fn_v += ("_h_" + str(van_geo_dict['Height']).replace(".", "p"))
    abs_cache_fn_v += ("_env_" + sam_env_dict['Name'])
    # Get calibration, characterization, and other settings
    merging = config['Merging']
    binning = merging['QBinning']
    characterizations = merging.get('Characterizations', None)

    # Get the self scattering option for each bank
    self_scattering_level_correction = get_self_scattering_level(config,
                                                                 binning[2])
    if not isinstance(self_scattering_level_correction, dict):
        raise RuntimeError()

    # Get Resonance filter configuration
    res_filter = config.get('ResonanceFilter', None)
    if res_filter is not None:
        res_filter_axis = res_filter.get('Axis', None)
        res_filter_lower = res_filter.get('LowerLimits', None)
        res_filter_upper = res_filter.get('UpperLimits', None)

    # Grouping
    grouping = merging.get('Grouping', None)

    cache_dir = config.get("CacheDir", os.path.abspath('.'))
    OutputDir = config.get("OutputDir", os.path.abspath('.'))

    # Create Nexus file basenames
    sample['Runs'] = expand_ints(sample['Runs'])
    sample['Background']['Runs'] = expand_ints(
        sample['Background'].get('Runs', None))
    '''
    Currently not implemented:
    # wkspIndices = merging.get('SumBanks', None)
    # high_q_linear_fit_range = config['HighQLinearFitRange']
    POWGEN options not used
    #alignAndFocusArgs['RemovePromptPulseWidth'] = 50
    # alignAndFocusArgs['CompressTolerance'] use defaults
    # alignAndFocusArgs['UnwrapRef'] POWGEN option
    # alignAndFocusArgs['LowResRef'] POWGEN option
    # alignAndFocusArgs['LowResSpectrumOffset'] POWGEN option
    How much of each bank gets merged has info here in the form of
    # {"ID", "Qmin", "QMax"}
    # alignAndFocusArgs['CropWavelengthMin'] from characterizations file
    # alignAndFocusArgs['CropWavelengthMax'] from characterizations file
    '''
    #################################################################
    # Figure out experimental runs either with run numbers
    # and facility name or retrieve file name from 'config'
    #
    # including
    # sample, sample background,
    # container, container background,
    # vanadium, vanadium background
    #################################################################
    if facility == 'SNS':
        facility_file_format = '%s_%d'
    else:
        facility_file_format = '%s%d'

    sam_scans = ','.join([facility_file_format % (instr, num)
                          for num in sample['Runs']])
    container_scans = ','.join([facility_file_format % (instr, num)
                                for num in sample['Background']["Runs"]])
    container_bg = None
    if "Background" in sample['Background']:
        sample['Background']['Background']['Runs'] = expand_ints(
            sample['Background']['Background']['Runs'])
        container_bg = ','.join([facility_file_format % (
            instr, num) for num in sample['Background']['Background']['Runs']])
        if len(container_bg) == 0:
            container_bg = None

    van['Runs'] = expand_ints(van['Runs'])
    van_scans = ','.join([facility_file_format % (instr, num)
                          for num in van['Runs']])

    van_bg_scans = None
    if 'Background' in van:
        van_bg_scans = van['Background']['Runs']
        van_bg_scans = expand_ints(van_bg_scans)
        van_bg_scans = ','.join([facility_file_format %
                                 (instr, num) for num in van_bg_scans])

    # Override Nexus file basename with Filenames if present
    if "Filenames" in sample:
        sam_scans = ','.join(sample["Filenames"])
    if "Filenames" in sample['Background']:
        container_scans = ','.join(sample['Background']["Filenames"])
    if "Background" in sample['Background']:
        if "Filenames" in sample['Background']['Background']:
            container_bg = ','.join(
                sample['Background']['Background']['Filenames'])
    if "Filenames" in van:
        van_scans = ','.join(van["Filenames"])
    if "Background" in van:
        if "Filenames" in van['Background']:
            van_bg_scans = ','.join(van['Background']["Filenames"])

    # Output nexus filename
    nexus_filename = title + '.nxs'
    try:
        os.remove(os.path.join(OutputDir, nexus_filename))
        msg = "Old NeXus file found. Will delete it."
        msg1 = "Old NeXus file: {}"
        log.notice(msg)
        log.notice(msg1.format(os.path.join(OutputDir, nexus_filename)))
    except OSError:
        msg = "Old NeXus file not found. Moving forward."
        log.notice(msg)
    #################################################################
    # Process absorption, multiple scattering and inelastic
    # correction setup from input 'config'
    # and
    # Create absorption workspace for
    # - sample and container
    # - vanadium
    #################################################################
    # Get sample corrections
    new_abs_methods = ['SampleOnly', 'SampleAndContainer', 'FullPaalmanPings']
    sam_abs_corr = sample.get("AbsorptionCorrection", None)
    sam_ms_corr = sample.get("MultipleScatteringCorrection", None)
    sam_inelastic_corr = SetInelasticCorrection(
        sample.get('InelasticCorrection', None))

    # get the element size
    sam_abs_ms_param = sample.get("AbsMSParameters", None)
    sam_elementsize = 1.0  # mm
    con_elementsize = 1.0  # mm
    if sam_abs_ms_param:
        elementsize = sam_abs_ms_param.get("ElementSize", 1.0)
        if type(elementsize) == list:
            sam_elementsize = elementsize[0]
            con_elementsize = elementsize[1]
        else:
            sam_elementsize = elementsize
            con_elementsize = elementsize

    # Compute the absorption correction on the sample if it was provided
    sam_abs_ws = ''
    con_abs_ws = ''
    if sam_abs_corr:
        if sam_abs_corr["Type"] in new_abs_methods:
            msg = "Applying '{}' absorption correction to sample"
            log.notice(msg.format(sam_abs_corr["Type"]))
            sam_ms_method = None
            if sam_ms_corr:
                sam_ms_method = sam_ms_corr.get("Type", None)
                if sam_ms_method is not None:
                    log.notice(
                        f"Apply {sam_ms_method} multiple scattering correction "
                        "to sample"
                    )
            ipts = GetIPTS(Instrument='NOM',
                           RunNumber=sam_scans.split(",")[0].split("_")[1])
            abs_cache_fn_s = abs_cache_fn + "_s.nxs"
            abs_cache_fn_c = abs_cache_fn + "_c.nxs"
            central_cache_f_s = os.path.join(ipts, "shared/caching",
                                             abs_cache_fn_s)
            central_cache_f_c = os.path.join(ipts, "shared/caching",
                                             abs_cache_fn_c)
            f_s_exists = os.path.exists(central_cache_f_s)
            f_c_exists = os.path.exists(central_cache_f_c)
            redo_cond_1 = not f_s_exists
            so = sam_abs_corr["Type"] == "SampleOnly"
            redo_cond_2 = f_s_exists and (not f_c_exists) and (not so)
            if redo_cond_1 or redo_cond_2:
                sam_abs_ws, con_abs_ws = create_absorption_wksp(
                    sam_scans,
                    sam_abs_corr["Type"],
                    sam_geo_dict,
                    sam_mat_dict,
                    sam_env_dict,
                    ms_method=sam_ms_method,
                    elementsize=sam_elementsize,
                    con_elementsize=con_elementsize,
                    **config)
                # save abs workspaces to cached file
                if not os.path.exists(os.path.join(ipts, "shared/caching")):
                    os.makedirs(os.path.join(ipts, "shared/caching"))
                SaveNexus(InputWorkspace=sam_abs_ws,
                          Filename=central_cache_f_s)
                if con_abs_ws != "":
                    SaveNexus(InputWorkspace=con_abs_ws,
                              Filename=central_cache_f_c)
            else:
                # read cached abs workspaces
                msg = "Cached absorption file found for sample."
                msg += " Will load and use it."
                log.notice(msg)
                sam_abs_ws = LoadNexus(Filename=central_cache_f_s)
                if f_c_exists:
                    msg = "Cached absorption file found for container."
                    msg += " Will load and use it."
                    log.notice(msg)
                    con_abs_ws = LoadNexus(Filename=central_cache_f_c)
                else:
                    con_abs_ws = ""
    # Get vanadium corrections
    van_mass_density = van.get('MassDensity', van_mass_density)
    # FIXME - van_packing_fraction is specified but not used
    van_packing_fraction = van.get(
        'PackingFraction',
        van_packing_fraction)
    van_abs_corr = van.get("AbsorptionCorrection", {"Type": None})
    van_ms_corr = van.get("MultipleScatteringCorrection", {"Type": None})
    van_inelastic_corr = SetInelasticCorrection(
        van.get('InelasticCorrection', None))

    # get the elementsize for vanadium
    van_abs_ms_param = van.get("AbsMSParameters", None)
    van_elementsize = 1.0
    if van_abs_ms_param:
        van_elementsize = van_abs_ms_param.get("ElementSize", 1.0)

    # Compute the absorption correction for the vanadium if provided
    van_abs_corr_ws = ''
    if van_abs_corr:
        if van_abs_corr["Type"] in new_abs_methods:
            msg = "Applying '{}' absorption correction to vanadium"
            log.notice(msg.format(van_abs_corr["Type"]))
            van_ms_method = None
            if van_ms_corr:
                van_ms_method = van_ms_corr.get("Type", None)
                if van_ms_method is not None:
                    log.notice(
                        f"Apply {van_ms_method} multiple scattering correction "
                        "to vanadium"
                    )
            abs_cache_fn_v += "_vanadium.nxs"
            central_cache_f_v = os.path.join("/SNS/NOM/shared/mts_abs_ms",
                                             abs_cache_fn_v)
            if not os.path.exists(central_cache_f_v):
                van_abs_corr_ws, van_con_ws = create_absorption_wksp(
                    van_scans,
                    van_abs_corr["Type"],
                    van_geo_dict,
                    van_mat_dict,
                    ms_method=van_ms_method,
                    elementsize=van_elementsize,
                    **config)
                SaveNexus(InputWorkspace=van_abs_corr_ws,
                          Filename=central_cache_f_v)
            else:
                msg = "Cached absorption file found for vanadium."
                msg += " Will load and use it."
                log.notice(msg)
                van_abs_corr_ws = LoadNexus(Filename=central_cache_f_v)
    #################################################################
    # Set up parameters for AlignAndFocus
    # and
    # Create calibration, mask and grouping workspace
    #################################################################
    alignAndFocusArgs = dict()
    alignAndFocusArgs['CalFilename'] = config['Calibration']['Filename']
    # alignAndFocusArgs['GroupFilename'] don't use
    # alignAndFocusArgs['Params'] = "0.,0.02,40."
    alignAndFocusArgs['ResampleX'] = -6000
    alignAndFocusArgs['Dspacing'] = False
    alignAndFocusArgs['PreserveEvents'] = False
    alignAndFocusArgs['MaxChunkSize'] = 8
    alignAndFocusArgs['CacheDir'] = os.path.abspath(cache_dir)

    # add resonance filter related properties
    # NOTE: the default behavior is no filtering if not specified.
    if res_filter is not None:
        alignAndFocusArgs['ResonanceFilterUnits'] = res_filter_axis
        alignAndFocusArgs['ResonanceFilterLowerLimits'] = res_filter_lower
        alignAndFocusArgs['ResonanceFilterUpperLimits'] = res_filter_upper

    # Get any additional AlignAndFocusArgs from JSON input
    if "AlignAndFocusArgs" in config:
        otherArgs = config["AlignAndFocusArgs"]
        alignAndFocusArgs.update(otherArgs)

    # Setup grouping
    output_grouping = False
    grp_wksp = "wksp_output_group"
    if grouping:
        if 'Initial' in grouping:
            if grouping['Initial'] and not grouping['Initial'] == u'':
                alignAndFocusArgs['GroupFilename'] = grouping['Initial']
        if 'Output' in grouping:
            if grouping['Output'] and not grouping['Output'] == u'':
                output_grouping = True
                LoadDetectorsGroupingFile(InputFile=grouping['Output'],
                                          OutputWorkspace=grp_wksp)
    # If no output grouping specified, create it with Calibration Grouping
    if not output_grouping:
        LoadDiffCal(alignAndFocusArgs['CalFilename'],
                    InstrumentName=instr,
                    WorkspaceName=grp_wksp.replace('_group', ''),
                    MakeGroupingWorkspace=True,
                    MakeCalWorkspace=False,
                    MakeMaskWorkspace=False)
    # Setup the 6 bank method if no grouping specified
    if not grouping:
        CreateGroupingWorkspace(InstrumentName=instr,
                                GroupDetectorsBy='Group',
                                OutputWorkspace=grp_wksp)

    alignAndFocusArgs['GroupingWorkspace'] = grp_wksp
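
The absorption-cache naming scheme used above can be sketched as a standalone helper (the helper name is hypothetical; the real workflow builds the stem inline as shown in the code):

```python
# Hypothetical standalone sketch of the absorption-cache naming scheme:
# material, densities, and geometry are flattened into a file stem, with
# "." replaced by "p" so the stem is filesystem-safe.
def abs_cache_stem(formula, mass_density, packing_fraction,
                   shape, radius, height, env_name, container):
    def p(x):
        # make a number safe for use in a file name
        return str(x).replace(".", "p")
    stem = formula.replace(" ", "_")
    stem += "_md_" + p(mass_density)
    stem += "_pf_" + p(packing_fraction)
    stem += "_sp_" + shape
    stem += "_r_" + p(radius)
    stem += "_h_" + p(height)
    stem += "_env_" + env_name
    stem += "_cont_" + container
    return stem

stem = abs_cache_stem("Si O2", 2.2, 0.6, "Cylinder", 0.3, 1.8,
                      "InAir", "PAC06")
# -> "Si_O2_md_2p2_pf_0p6_sp_Cylinder_r_0p3_h_1p8_env_InAir_cont_PAC06"
```

Because the stem encodes everything the absorption calculation depends on, a cache hit is guaranteed to refer to an equivalent calculation.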

  3. Specifically, at the initial stage, the workflow deals with absorption and multiple scattering, as handled in the absorption-correction section of the code above.


    The absorption and multiple scattering corrections are implemented in a single-shot workflow and are applied to the loaded raw data through a comprehensive donor workspace. In the overall workflow, the donor workspace is therefore constructed first, holding the calculated absorption coefficients and/or multiple scattering factors per the specification in the input JSON file. When the data are loaded, the donor workspace is passed to the loader and applied directly to the loaded data. One thing to point out: since we have the complication of container subtraction, the calculated absorption coefficients and/or multiple scattering factors are applied to both sample and container in a compact way.

    The overall summary for the application of the absorption and/or multiple scattering corrections can be found in the following document,

    https://latex.ornl.gov/read/pgpqpfkygtpd

    The compact implementation of the effective coefficients is summarized in the following comment lines in the source code,

    For sample and container:

    # -------------------------------- #
    # Load Sample Container Background #
    # -------------------------------- #
    # NOTE: sample container background IS the empty instrument
    # The full formula:
    #
    #         alpha_s (I_s - I_e) - alpha_c (I_c - I_e)
    #     I = -----------------------------------------
    #                  alpha_v (I_v - I_v,e)
    #
    #         alpha_s I_s - alpha_c I_c - alpha_e I_e
    #       = ---------------------------------------
    #              alpha_v I_v - alpha_v I_v,e
    #
    # where
    #                                                     A_c - A_s
    #     alpha_e = alpha_s - alpha_c = 1/A_s - 1/A_c = -------------
    #                                                     A_s * A_c

    For vanadium and empty instrument:

    # ------------------------ #
    # Load Vanadium Background #
    # ------------------------ #
    # NOTE:
    # The full formula:
    #
    #         alpha_s (I_s - I_e) - alpha_c (I_c - I_e)
    #     I = -----------------------------------------
    #                  alpha_v (I_v - I_v,e)
    #
    #         alpha_s I_s - alpha_c I_c - alpha_e I_e
    #       = ---------------------------------------
    #              alpha_v I_v - alpha_v I_v,e
    #
    # where
    #     * I_v,e is the vanadium background
    #     * alpha_e = (alpha_s - alpha_c)
    #
    # ALSO, alpha is the inverse of the [effective] absorption coefficient,
    # i.e. alpha = 1/A

    When multiple scattering correction is enabled, the effective coefficients mentioned in those comment lines above contain the contribution from both absorption and multiple scattering effects.
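
    The identities quoted in those comment lines can be checked numerically with arbitrary effective coefficients:

    ```python
    # Check alpha_e = alpha_s - alpha_c = (A_c - A_s) / (A_s * A_c),
    # using arbitrary effective absorption factors (alpha = 1/A).
    A_s, A_c = 0.8, 0.9
    alpha_s, alpha_c = 1.0 / A_s, 1.0 / A_c
    alpha_e = alpha_s - alpha_c
    assert abs(alpha_e - (A_c - A_s) / (A_s * A_c)) < 1e-12

    # The two forms of the numerator of the master formula are identical:
    # alpha_s*(I_s - I_e) - alpha_c*(I_c - I_e)
    #   == alpha_s*I_s - alpha_c*I_c - alpha_e*I_e
    I_s, I_c, I_e = 100.0, 40.0, 5.0
    lhs = alpha_s * (I_s - I_e) - alpha_c * (I_c - I_e)
    rhs = alpha_s * I_s - alpha_c * I_c - alpha_e * I_e
    assert abs(lhs - rhs) < 1e-12
    ```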

  4. Load in the sample and container and apply the absorption correction to each (following the rationale presented above),

    #################################################################
    # Load, calibrate and diffraction focus
    # (1) sample (2) container (3) container background
    # (4) vanadium (5) vanadium background
    #################################################################
    # TODO take out the RecalculatePCharge in the future once tested
    # Load Sample
    print("#-----------------------------------#")
    print("# Sample")
    print("#-----------------------------------#")
    sam_wksp = load(
        'sample',
        sam_scans,
        sam_geometry,
        sam_material,
        sam_mass_density,
        sam_abs_ws,
        **alignAndFocusArgs)
    sample_title = "sample_and_container"
    save_banks(InputWorkspace=sam_wksp,
               Filename=nexus_filename,
               Title=sample_title,
               OutputDir=OutputDir,
               GroupingWorkspace=grp_wksp,
               Binning=binning)

    sam_molecular_mass = mtd[sam_wksp].sample(
    ).getMaterial().relativeMolecularMass()
    natoms = getNumberAtoms(
        sam_packing_fraction,
        sam_mass_density,
        sam_molecular_mass,
        Geometry=sam_geometry)

    # Load Sample Container
    print("#-----------------------------------#")
    print("# Sample Container")
    print("#-----------------------------------#")
    container = load(
        'container',
        container_scans,
        absorption_wksp=con_abs_ws,
        **alignAndFocusArgs)
    save_banks(
        InputWorkspace=container,
        Filename=nexus_filename,
        Title=container,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)

    # -------------------------------- #
    # Load Sample Container Background #
    # -------------------------------- #
    # NOTE: sample container background IS the empty instrument
    # The full formula:
    #
    #         alpha_s (I_s - I_e) - alpha_c (I_c - I_e)
    #     I = -----------------------------------------
    #                  alpha_v (I_v - I_v,e)
    #
    #         alpha_s I_s - alpha_c I_c - alpha_e I_e
    #       = ---------------------------------------
    #              alpha_v I_v - alpha_v I_v,e
    #
    # where
    #                                                     A_c - A_s
    #     alpha_e = alpha_s - alpha_c = 1/A_s - 1/A_c = -------------
    #                                                     A_s * A_c
    if container_bg is not None:
        print("#-----------------------------------#")
        print("# Sample Container's Background")
        print("#-----------------------------------#")
        # NOTE: to make life easier
        #     alpha_e I_e = alpha_s I_e - alpha_c I_e
        container_bg_fn = container_bg
        container_bg = load(
            'container_background',
            container_bg_fn,
            absorption_wksp=sam_abs_ws,
            **alignAndFocusArgs)
        tmp = load(
            'container_background',
            container_bg_fn,
            absorption_wksp=con_abs_ws,
            **alignAndFocusArgs)
        Minus(
            LHSWorkspace=container_bg,
            RHSWorkspace=tmp,
            OutputWorkspace=container_bg)
        save_banks(
            InputWorkspace=container_bg,
            Filename=nexus_filename,
            Title=container_bg,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)

    Loading either the sample or the container is fundamentally based on Mantid's AlignAndFocusPowderFromFiles, a wrapper around the underlying AlignAndFocusPowder algorithm. It does what its name says: it aligns and focuses the powder patterns coming from all detectors. This step consumes calibration and masking information, but the calibration and masking themselves are not performed inside the mantidtotalscattering workflow. Instead, they are pre-configured by a separate routine, and the result (DIFC values and masked pixel IDs) is stored in an H5 file that mantidtotalscattering reads in to carry out the calibration and focusing step.
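
    Conceptually, the alignment applies the stored per-pixel calibration to convert time-of-flight to d-spacing. A rough sketch under the standard GSAS-style convention TOF = DIFC * d (the DIFC value below is made up, and the real algorithm also handles the higher-order DIFA/TZERO terms, masking, and focusing into banks):

    ```python
    # Per-pixel time-of-flight -> d-spacing conversion using a calibrated
    # DIFC constant (hypothetical value), per TOF = DIFC * d.
    tof_us = [5000.0, 10000.0, 15000.0]  # time-of-flight, microseconds
    difc = 5000.0                        # made-up calibration constant
    d_spacing = [t / difc for t in tof_us]
    # -> [1.0, 2.0, 3.0] (angstroms)
    ```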

  5. Load in the vanadium and its background (i.e., the empty instrument) and repeat the same process as in step 4 for the sample and container,

    # Load Vanadium
    print("#-----------------------------------#")
    print("# Vanadium")
    print("#-----------------------------------#")
    van_wksp = load(
        'vanadium',
        van_scans,
        van_geometry,
        van_material,
        van_mass_density,
        van_abs_corr_ws,
        **alignAndFocusArgs)
    vanadium_title = "vanadium_and_background"
    save_banks(
        InputWorkspace=van_wksp,
        Filename=nexus_filename,
        Title=vanadium_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)

    van_material = mtd[van_wksp].sample().getMaterial()
    van_molecular_mass = van_material.relativeMolecularMass()
    nvan_atoms = getNumberAtoms(
        1.0,
        van_mass_density,
        van_molecular_mass,
        Geometry=van_geometry)

    print("Sample natoms:", natoms)
    print("Vanadium natoms:", nvan_atoms)
    print("Vanadium natoms / Sample natoms:", nvan_atoms / natoms)

    # ------------------------ #
    # Load Vanadium Background #
    # ------------------------ #
    # NOTE:
    # The full formula:
    #
    #         alpha_s (I_s - I_e) - alpha_c (I_c - I_e)
    #     I = -----------------------------------------
    #                  alpha_v (I_v - I_v,e)
    #
    #         alpha_s I_s - alpha_c I_c - alpha_e I_e
    #       = ---------------------------------------
    #              alpha_v I_v - alpha_v I_v,e
    #
    # where
    #     * I_v,e is the vanadium background
    #     * alpha_e = (alpha_s - alpha_c)
    #
    # ALSO, alpha is the inverse of the [effective] absorption coefficient,
    # i.e. alpha = 1/A
    van_bg = None
    if van_bg_scans is not None:
        print("#-----------------------------------#")
        print("# Vanadium Background")
        print("#-----------------------------------#")
        # van_bg = alpha_v I_v,e
        van_bg = load(
            'vanadium_background',
            van_bg_scans,
            absorption_wksp=van_abs_corr_ws,
            **alignAndFocusArgs)
        vanadium_bg_title = "vanadium_background"
        save_banks(
            InputWorkspace=van_bg,
            Filename=nexus_filename,
            Title=vanadium_bg_title,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)
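
    The printed natoms ratio feeds the vanadium normalization. A rough, self-contained sketch (with made-up numbers) of what the atom counting amounts to for a cylindrical sample; Mantid's getNumberAtoms encapsulates this, and its exact implementation may differ:

    ```python
    import math

    # Total atoms ~ packing_fraction * mass_density * volume / molar_mass * N_A
    # for a cylinder of given radius and height (cgs units, illustrative only).
    N_A = 6.02214076e23  # Avogadro's number

    def number_of_atoms(packing_fraction, mass_density_g_cm3,
                        molar_mass_g_mol, radius_cm, height_cm):
        volume = math.pi * radius_cm ** 2 * height_cm
        moles = (packing_fraction * mass_density_g_cm3 * volume
                 / molar_mass_g_mol)
        return moles * N_A

    n_sam = number_of_atoms(0.6, 2.2, 60.08, 0.3, 1.8)   # e.g. SiO2 units
    n_van = number_of_atoms(1.0, 6.11, 50.94, 0.3, 1.8)  # vanadium
    ratio = n_van / n_sam
    ```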

  6. Background subtraction for both sample and vanadium,

    #################################################################
    # STEP 1: Subtract Backgrounds
    #     van       = van - van_bg
    #     sam       = sam - container
    #     container = container - container_bg
    # and save (1) van (2) sam (3) container
    #################################################################
    sam_raw = 'sam_raw'
    CloneWorkspace(
        InputWorkspace=sam_wksp,
        OutputWorkspace=sam_raw)  # for later

    container_raw = 'container_raw'
    CloneWorkspace(
        InputWorkspace=container,
        OutputWorkspace=container_raw)  # for later

    if van_bg is not None:
        RebinToWorkspace(
            WorkspaceToRebin=van_bg,
            WorkspaceToMatch=van_wksp,
            OutputWorkspace=van_bg)
        Minus(
            LHSWorkspace=van_wksp,
            RHSWorkspace=van_bg,
            OutputWorkspace=van_wksp)

    RebinToWorkspace(
        WorkspaceToRebin=container,
        WorkspaceToMatch=sam_wksp,
        OutputWorkspace=container)
    Minus(
        LHSWorkspace=sam_wksp,
        RHSWorkspace=container,
        OutputWorkspace=sam_wksp)

    if container_bg is not None:
        RebinToWorkspace(
            WorkspaceToRebin=container_bg,
            WorkspaceToMatch=sam_wksp,
            OutputWorkspace=container_bg)
        Minus(
            LHSWorkspace=sam_wksp,
            RHSWorkspace=container_bg,
            OutputWorkspace=sam_wksp)

    for wksp in [container, van_wksp, sam_wksp]:
        ConvertUnits(
            InputWorkspace=wksp,
            OutputWorkspace=wksp,
            Target="MomentumTransfer",
            EMode="Elastic")

    container_title = "container_minus_back"
    vanadium_title = "vanadium_minus_back"
    sample_title = "sample_minus_back"
    save_banks(
        InputWorkspace=container,
        Filename=nexus_filename,
        Title=container_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    save_banks(
        InputWorkspace=van_wksp,
        Filename=nexus_filename,
        Title=vanadium_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    save_banks(
        InputWorkspace=sam_wksp,
        Filename=nexus_filename,
        Title=sample_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
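
    The rebin-then-subtract pattern above can be sketched as follows. Mantid's RebinToWorkspace performs a proper histogram rebin onto the matching workspace's grid; the simple grid lookup here is only an illustration of why the background must be put on the sample's Q grid before Minus can be applied:

    ```python
    # Put the background on the sample's Q grid, then subtract point by point
    # (stand-in for the RebinToWorkspace + Minus pair above).
    q_sam = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
    i_sam = [5.0] * len(q_sam)
    # finer-grained "background workspace": flat signal of 2.0
    bg = {q: 2.0 for q in [0.0, 1.0, 2.0, 3.0, 4.0, 5.0,
                           6.0, 7.0, 8.0, 9.0, 10.0]}

    i_bg_rebinned = [bg[q] for q in q_sam]                 # "RebinToWorkspace"
    i_net = [s - b for s, b in zip(i_sam, i_bg_rebinned)]  # "Minus"
    # -> every net point equals 5.0 - 2.0 = 3.0
    ```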

  7. Save intermediate files and prepare for alternative ways of doing the absorption and multiple scattering corrections for vanadium. This part of the code needs to be refurbished to improve the performance of the workflow execution,

    #################################################################
    # STEP 2.0: Prepare vanadium as normalization calibration
    # Multiple-Scattering and Absorption (Steps 2-4) for Vanadium
    # Vanadium peak strip, smooth, Placzek
    #################################################################
    van_corrected = 'van_corrected'
    ConvertUnits(
        InputWorkspace=van_wksp,
        OutputWorkspace=van_corrected,
        Target="Wavelength",
        EMode="Elastic")
    if "Type" in van_abs_corr:
        if van_abs_corr['Type'] == 'Carpenter' \
                or van_ms_corr['Type'] == 'Carpenter':
            CarpenterSampleCorrection(
                InputWorkspace=van_corrected,
                OutputWorkspace=van_corrected,
                CylinderSampleRadius=van['Geometry']['Radius'])
        elif van_abs_corr['Type'] == 'Mayers' \
                or van_ms_corr['Type'] == 'Mayers':
            if van_ms_corr['Type'] == 'Mayers':
                MayersSampleCorrection(
                    InputWorkspace=van_corrected,
                    OutputWorkspace=van_corrected,
                    MultipleScattering=True)
            else:
                MayersSampleCorrection(
                    InputWorkspace=van_corrected,
                    OutputWorkspace=van_corrected,
                    MultipleScattering=False)
        else:
            pass
    else:
        CloneWorkspace(
            InputWorkspace=van_corrected,
            OutputWorkspace=van_corrected)

    ConvertUnits(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Target='MomentumTransfer',
        EMode='Elastic')
    vanadium_title += "_ms_abs_corrected"
    save_banks(
        InputWorkspace=van_corrected,
        Filename=nexus_filename,
        Title=vanadium_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    save_banks(
        InputWorkspace=van_corrected,
        Filename=nexus_filename,
        Title=vanadium_title + "_with_peaks",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)

    Questions to address,

    1. Do we really need to keep those intermediate files? If not, we can save a lot of computation time spent on transformations between different spaces.
    1. The two unit conversions currently outside the if statement for the alternative absorption correction routines should be moved inside it.
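The Carpenter/Mayers branching above (and its twin for the sample in a later step) can be reduced to a small decision helper, sketched below. `pick_correction` is a hypothetical name; the real workflow calls Mantid's CarpenterSampleCorrection / MayersSampleCorrection algorithms in place instead of returning a tuple.

```python
# Hypothetical helper: the Carpenter/Mayers dispatch reduced to its
# decision logic only. The real code invokes the Mantid algorithms.
def pick_correction(abs_type, ms_type):
    """Return (algorithm_name, multiple_scattering_flag) for the
    requested absorption and multiple-scattering correction types."""
    if 'Carpenter' in (abs_type, ms_type):
        # Carpenter handles absorption and multiple scattering together
        return 'CarpenterSampleCorrection', True
    if 'Mayers' in (abs_type, ms_type):
        # Mayers toggles multiple scattering via a flag
        return 'MayersSampleCorrection', ms_type == 'Mayers'
    # Fall through: the workspace is simply cloned, uncorrected
    return None, False

print(pick_correction('Mayers', 'None'))  # ('MayersSampleCorrection', False)
```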
  8. Strip the vanadium diffraction peaks and smooth the pattern, in preparation for the normalization in the next step,

    # Smooth Vanadium (strip peaks plus smooth)
    ConvertUnits(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Target='dSpacing',
        EMode='Elastic')
    # After StripVanadiumPeaks, the workspace goes from EventWorkspace ->
    # Workspace2D
    StripVanadiumPeaks(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        BackgroundType='Quadratic')
    ConvertUnits(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Target='MomentumTransfer',
        EMode='Elastic')
    vanadium_title += '_peaks_stripped'
    save_banks(
        InputWorkspace=van_corrected,
        Filename=nexus_filename,
        Title=vanadium_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    ConvertUnits(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Target='TOF',
        EMode='Elastic')
    FFTSmooth(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Filter="Butterworth",
        Params='20,2',
        IgnoreXBins=True,
        AllSpectra=True)
    ConvertUnits(
        InputWorkspace=van_corrected,
        OutputWorkspace=van_corrected,
        Target='MomentumTransfer',
        EMode='Elastic')
    vanadium_title += '_smoothed'
    save_banks(
        InputWorkspace=van_corrected,
        Filename=nexus_filename,
        Title=vanadium_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
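The FFTSmooth step low-pass filters the stripped vanadium spectrum in Fourier space with a Butterworth response. A minimal numpy analogue is sketched below; `butterworth_fft_smooth`, its `cutoff_fraction`, and the toy signal are all hypothetical, and the mapping of Mantid's `Params='20,2'` onto a cutoff and filter order is not reproduced here.

```python
import numpy as np

# Hypothetical helper: a self-contained analogue of Butterworth
# low-pass smoothing in Fourier space, in the spirit of FFTSmooth.
def butterworth_fft_smooth(y, cutoff_fraction=0.05, order=2):
    """Attenuate high-frequency components of a 1-D signal with the
    Butterworth response |H(f)| = 1 / sqrt(1 + (f/fc)^(2*order))."""
    freqs = np.fft.rfftfreq(len(y))      # 0 .. 0.5 cycles/sample
    fc = 0.5 * cutoff_fraction           # cutoff as a fraction of Nyquist
    response = 1.0 / np.sqrt(1.0 + (freqs / fc) ** (2 * order))
    return np.fft.irfft(np.fft.rfft(y) * response, n=len(y))

# Toy check: smoothing a noisy flat "vanadium level" keeps the baseline
# (the DC component passes with unit gain) while suppressing the noise.
rng = np.random.default_rng(7)
y = 10.0 + rng.normal(0.0, 0.5, 1024)
y_smooth = butterworth_fft_smooth(y)
```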

  9. Vanadium inelastic correction using the Placzek method, to first or second order,

    # Inelastic correction
    if van_inelastic_corr['Type'] == "Placzek":
        van_scan = van['Runs'][0]
        van_incident_wksp = 'van_incident_wksp'
        van_inelastic_opts = SetInelasticCorrection(
            van.get('InelasticCorrection', None))
        lambda_binning_fit = van_inelastic_opts['LambdaBinningForFit']
        lambda_binning_calc = van_inelastic_opts['LambdaBinningForCalc']
        print('van_scan:', van_scan)
        monitor_wksp = 'monitor'
        eff_corr_wksp = 'monitor_efficiency_correction_wksp'
        LoadNexusMonitors(Filename=facility_file_format % (instr, van_scan),
                          OutputWorkspace=monitor_wksp)
        ConvertUnits(InputWorkspace=monitor_wksp, OutputWorkspace=monitor_wksp,
                     Target='Wavelength', EMode='Elastic')
        Rebin(InputWorkspace=monitor_wksp,
              OutputWorkspace=monitor_wksp,
              Params=lambda_binning_fit,
              PreserveEvents=False)
        ConvertToPointData(InputWorkspace=monitor_wksp,
                           OutputWorkspace=monitor_wksp)
        CalculateEfficiencyCorrection(WavelengthRange=lambda_binning_fit,
                                      ChemicalFormula="(He3)",
                                      DensityType="Number Density",
                                      Density=1.93138101e-08,
                                      Thickness=.1,
                                      OutputWorkspace=eff_corr_wksp)
        Multiply(
            LHSWorkspace=monitor_wksp,
            RHSWorkspace=eff_corr_wksp,
            OutputWorkspace=van_incident_wksp)
        mtd[van_incident_wksp].setDistribution(True)
        fit_type = van['InelasticCorrection']['FitSpectrumWith']
        FitIncidentSpectrum(
            InputWorkspace=van_incident_wksp,
            OutputWorkspace=van_incident_wksp,
            FitSpectrumWith=fit_type,
            BinningForFit=lambda_binning_fit,
            BinningForCalc=lambda_binning_calc,
            PlotDiagnostics=False)
        van_placzek = 'van_placzek'
        SetSample(
            InputWorkspace=van_incident_wksp,
            Material=van_mat_dict)
        calc_interfere = van['InelasticCorrection']['Interference']
        if calc_interfere:
            van_t = float(van['InelasticCorrection']['SampleTemperature'])
            van_pl_order = 2
        else:
            van_t = None
            van_pl_order = 1
        CalculatePlaczek(
            InputWorkspace=van_corrected,
            IncidentSpectra=van_incident_wksp,
            OutputWorkspace=van_placzek,
            LambdaD=1.44,
            Order=van_pl_order,
            ScaleByPackingFraction=False,
            SampleTemperature=van_t)
        ConvertToHistogram(
            InputWorkspace=van_placzek,
            OutputWorkspace=van_placzek)
        if mtd[van_corrected].YUnit() == "":
            mtd[van_corrected].setYUnit("Counts")
        mtd[van_placzek].setYUnit("Counts")
        if not mtd[van_placzek].isDistribution():
            ConvertToDistribution(van_placzek)
        bin_tmp = float(lambda_binning_calc.split(',')[1])
        mtd[van_placzek] = mtd[van_placzek] * bin_tmp
        # Rebin and save in Q
        for wksp in [van_placzek, van_corrected]:
            ConvertUnits(
                InputWorkspace=wksp,
                OutputWorkspace=wksp,
                Target='MomentumTransfer',
                EMode='Elastic')
            Rebin(
                InputWorkspace=wksp,
                OutputWorkspace=wksp,
                Params=binning,
                PreserveEvents=True)
        save_banks(
            InputWorkspace=van_placzek,
            Filename=nexus_filename,
            Title="vanadium_placzek",
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)

    Notes

    1. The Placzek method may not work well for many systems, so we do need to think about alternative ways to treat the inelastic scattering properly. This is especially important for magnetic diffuse scattering, whose dominant signal sits in the low-Q region and overlaps strongly with the region where the inelastic scattering effect is strongest.
    1. Applying a Placzek-type correction requires knowing the incident flux and the detector efficiency. The incident flux is estimated from the monitor spectrum corrected for monitor efficiency, followed by a fit to obtain a smooth incident flux function. The detector efficiency is approximated following the formula given in the paper below,

    Howe, McGreevy, and Howells, "The analysis of liquid structure data from time-of-flight neutron diffractometry," Journal of Physics: Condensed Matter 1 (22), 3433-3451 (1989). doi: 10.1088/0953-8984/1/22/005
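The monitor efficiency model behind the CalculateEfficiencyCorrection call can be sketched as follows. This is a hedged, standalone model: `monitor_efficiency` is a hypothetical helper, 5333 barn is the tabulated He-3 absorption cross-section at the 2200 m/s reference wavelength, and the number density and thickness mirror the values used in the workflow above.

```python
import numpy as np

# Hedged sketch of a thin He-3 monitor's efficiency as a function of
# wavelength, eps(lam) = 1 - exp(-n * sigma(lam) * t), where the
# absorption cross-section of a 1/v absorber scales linearly with lam.
SIGMA_A_REF = 5333.0      # He-3 absorption cross-section [barn] at 1.7982 A
LAMBDA_REF = 1.7982       # reference wavelength [Angstrom]
BARN_TO_A2 = 1.0e-8       # 1 barn = 1e-8 Angstrom^2

def monitor_efficiency(lam, number_density=1.93138101e-08, thickness_cm=0.1):
    """number_density in atoms/Angstrom^3, thickness in cm."""
    sigma = SIGMA_A_REF * BARN_TO_A2 * lam / LAMBDA_REF  # [Angstrom^2]
    t = thickness_cm * 1.0e8                             # cm -> Angstrom
    return 1.0 - np.exp(-number_density * sigma * t)

# A beam monitor is deliberately near-transparent, so the efficiency is
# tiny and grows roughly linearly with wavelength; the measured monitor
# spectrum is divided by eps(lam) to estimate the incident flux.
lam = np.linspace(0.1, 3.0, 30)
eps = monitor_efficiency(lam)
```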

  10. Normalization by vanadium,

    #################################################################
    # STEP 2.1: Normalize by Vanadium and
    # convert to Q (momentum transfer)
    # For sample, container, sample raw, vanadium background
    #################################################################
    wksp_list = [sam_wksp, sam_raw, van_corrected]
    for name in wksp_list:
        ConvertUnits(
            InputWorkspace=name,
            OutputWorkspace=name,
            Target='MomentumTransfer',
            EMode='Elastic',
            ConvertFromPointData=False)
        Rebin(
            InputWorkspace=name,
            OutputWorkspace=name,
            Params=binning,
            PreserveEvents=True)
    # Save (sample - back) / van_corrected
    Divide(
        LHSWorkspace=sam_wksp,
        RHSWorkspace=van_corrected,
        OutputWorkspace=sam_wksp)
    sample_title += "_normalized"
    save_banks(
        InputWorkspace=sam_wksp,
        Filename=nexus_filename,
        Title=sample_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    # Save the sample / normalized (ie no background subtraction)
    Divide(
        LHSWorkspace=sam_raw,
        RHSWorkspace=van_corrected,
        OutputWorkspace=sam_raw)
    save_banks(
        InputWorkspace=sam_raw,
        Filename=nexus_filename,
        Title="sample_normalized",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    # Output an initial I(Q) for sample
    iq_filename = title + '_initial_iofq_banks.nxs'
    try:
        os.remove(os.path.join(OutputDir, iq_filename))
        msg = "Old NeXus file found for initial iofq. Will delete it."
        msg1 = "Old NeXus file: {}"
        log.notice(msg)
        log.notice(msg1.format(os.path.join(OutputDir, iq_filename)))
    except OSError:
        msg = "Old NeXus file not found for initial iofq. Moving forward."
        log.notice(msg)
    save_banks(
        InputWorkspace=sam_wksp,
        Filename=iq_filename,
        Title="IQ_banks",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    wksp_list = [container, container_raw, van_corrected]
    if container_bg is not None:
        wksp_list.append(container_bg)
    if van_bg is not None:
        wksp_list.append(van_bg)
    for name in wksp_list:
        ConvertUnits(
            InputWorkspace=name,
            OutputWorkspace=name,
            Target='MomentumTransfer',
            EMode='Elastic',
            ConvertFromPointData=False)
        Rebin(
            InputWorkspace=name,
            OutputWorkspace=name,
            Params=binning,
            PreserveEvents=True)
    # Save the container - container_background / normalized
    Divide(
        LHSWorkspace=container,
        RHSWorkspace=van_corrected,
        OutputWorkspace=container)
    container_title += '_normalized'
    save_banks(
        InputWorkspace=container,
        Filename=nexus_filename,
        Title=container_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    # Save the container / normalized (ie no background subtraction)
    Divide(
        LHSWorkspace=container_raw,
        RHSWorkspace=van_corrected,
        OutputWorkspace=container_raw)
    save_banks(
        InputWorkspace=container_raw,
        Filename=nexus_filename,
        Title="container_normalized",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    # Save the container_background / normalized
    if container_bg is not None:
        Divide(
            LHSWorkspace=container_bg,
            RHSWorkspace=van_corrected,
            OutputWorkspace=container_bg)
        container_bg_title = "container_back_normalized"
        save_banks(
            InputWorkspace=container_bg,
            Filename=nexus_filename,
            Title=container_bg_title,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)
    # Save the vanadium_background / normalized
    if van_bg is not None:
        Divide(
            LHSWorkspace=van_bg,
            RHSWorkspace=van_corrected,
            OutputWorkspace=van_bg)
        vanadium_bg_title += "_normalized"
        save_banks(
            InputWorkspace=van_bg,
            Filename=nexus_filename,
            Title=vanadium_bg_title,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)

    Again, this part of the code contains potentially redundant computation and file-saving effort.

  11. Alternative routes for the sample absorption and multiple scattering corrections,

    #################################################################
    # STEP 3 & 4: Subtract multiple scattering and apply absorption correction
    #################################################################
    ConvertUnits(
        InputWorkspace=sam_wksp,
        OutputWorkspace=sam_wksp,
        Target="Wavelength",
        EMode="Elastic")
    sam_corrected = 'sam_corrected'
    if sam_abs_corr and sam_ms_corr:
        # When the full PP approach is used for the absorption correction,
        # the calculation was already performed while loading data at the
        # align-and-focus stage. The multiple scattering calculation was
        # also embedded there.
        if sam_abs_corr['Type'] == 'Carpenter' \
                or sam_ms_corr['Type'] == 'Carpenter':
            CarpenterSampleCorrection(
                InputWorkspace=sam_wksp,
                OutputWorkspace=sam_corrected,
                CylinderSampleRadius=sample['Geometry']['Radius'])
        elif sam_abs_corr['Type'] == 'Mayers' \
                or sam_ms_corr['Type'] == 'Mayers':
            if sam_ms_corr['Type'] == 'Mayers':
                MayersSampleCorrection(
                    InputWorkspace=sam_wksp,
                    OutputWorkspace=sam_corrected,
                    MultipleScattering=True)
            else:
                MayersSampleCorrection(
                    InputWorkspace=sam_wksp,
                    OutputWorkspace=sam_corrected,
                    MultipleScattering=False)
        else:
            CloneWorkspace(
                InputWorkspace=sam_wksp,
                OutputWorkspace=sam_corrected)
        ConvertUnits(
            InputWorkspace=sam_corrected,
            OutputWorkspace=sam_corrected,
            Target='MomentumTransfer',
            EMode='Elastic')
        sample_title += "_ms_abs_corrected"
        save_banks(
            InputWorkspace=sam_corrected,
            Filename=nexus_filename,
            Title=sample_title,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)
    else:
        CloneWorkspace(InputWorkspace=sam_wksp, OutputWorkspace=sam_corrected)

  12. Normalization by the total number of atoms,

    #################################################################
    # STEP 5: Divide by number of atoms in sample
    #################################################################
    mtd[sam_corrected] = (nvan_atoms / natoms) * mtd[sam_corrected]
    ConvertUnits(
        InputWorkspace=sam_corrected,
        OutputWorkspace=sam_corrected,
        Target='MomentumTransfer',
        EMode='Elastic')
    sample_title += "_norm_by_atoms"
    save_banks(
        InputWorkspace=sam_corrected,
        Filename=nexus_filename,
        Title=sample_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)

  13. Wrap up the final factors to be applied,

    #################################################################
    # STEP 6: Divide by total scattering length squared = total scattering
    # cross-section over 4 * pi
    #################################################################
    van_material = mtd[van_corrected].sample().getMaterial()
    sigma_v = van_material.totalScatterXSection()
    prefactor = (sigma_v / (4. * np.pi))
    msg = "Total scattering cross-section of Vanadium:{} sigma_v / 4*pi: {}"
    print(msg.format(sigma_v, prefactor))
    if van_inelastic_corr['Type'] == "Placzek":
        mtd[sam_corrected] = (prefactor + mtd[van_placzek]) * mtd[sam_corrected]
    else:
        mtd[sam_corrected] = prefactor * mtd[sam_corrected]
    sample_title += '_multiply_by_vanSelfScat'
    save_banks(
        InputWorkspace=sam_corrected,
        Filename=nexus_filename,
        Title=sample_title,
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)

    Refer to the following paper for detailed formulation,

    P. Peterson et al., J. Appl. Cryst. 54, 317-332 (2021).
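As a quick sanity check of the step-6 prefactor, using tabulated vanadium cross-sections (about 0.018 barn coherent plus 5.08 barn incoherent scattering); in the workflow, `sigma_v` is instead read from the workspace's sample material.

```python
import numpy as np

# Prefactor = sigma_s / (4*pi), with the tabulated total scattering
# cross-section of vanadium (~5.10 barn) standing in for the value
# Mantid reads from the sample material.
sigma_v = 5.10                      # total scattering cross-section [barn]
prefactor = sigma_v / (4.0 * np.pi)
print(round(prefactor, 3))  # 0.406
```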

  14. Inelastic scattering correction for the sample -- see the notes under step 9 above,

    #################################################################
    # STEP 7: Inelastic correction
    #################################################################
    if sam_inelastic_corr['Type'] == "Placzek":
        if sam_material is None:
            error = "For Placzek correction, must specify a sample material."
            raise Exception(error)
        sam_scan = sample['Runs'][0]
        sam_incident_wksp = 'sam_incident_wksp'
        sam_inelastic_opts = SetInelasticCorrection(
            sample.get('InelasticCorrection', None))
        lambda_binning_fit = sam_inelastic_opts['LambdaBinningForFit']
        lambda_binning_calc = sam_inelastic_opts['LambdaBinningForCalc']
        monitor_wksp = 'monitor'
        eff_corr_wksp = 'monitor_efficiency_correction_wksp'
        LoadNexusMonitors(Filename=facility_file_format % (instr, sam_scan),
                          OutputWorkspace=monitor_wksp)
        ConvertUnits(InputWorkspace=monitor_wksp, OutputWorkspace=monitor_wksp,
                     Target='Wavelength', EMode='Elastic')
        Rebin(InputWorkspace=monitor_wksp,
              OutputWorkspace=monitor_wksp,
              Params=lambda_binning_fit,
              PreserveEvents=False)
        ConvertToPointData(InputWorkspace=monitor_wksp,
                           OutputWorkspace=monitor_wksp)
        CalculateEfficiencyCorrection(WavelengthRange=lambda_binning_fit,
                                      ChemicalFormula="(He3)",
                                      DensityType="Number Density",
                                      Density=1.93138101e-08,
                                      Thickness=.1,
                                      OutputWorkspace=eff_corr_wksp)
        Multiply(
            LHSWorkspace=monitor_wksp,
            RHSWorkspace=eff_corr_wksp,
            OutputWorkspace=sam_incident_wksp)
        mtd[sam_incident_wksp].setDistribution(True)
        fit_type = sample['InelasticCorrection']['FitSpectrumWith']
        FitIncidentSpectrum(
            InputWorkspace=sam_incident_wksp,
            OutputWorkspace=sam_incident_wksp,
            FitSpectrumWith=fit_type,
            BinningForFit=lambda_binning_fit,
            BinningForCalc=lambda_binning_calc)
        sam_placzek = 'sam_placzek'
        SetSample(
            InputWorkspace=sam_incident_wksp,
            Material=sam_mat_dict)
        calc_interfere = sample['InelasticCorrection']['Interference']
        if calc_interfere:
            sample_t = float(sample['InelasticCorrection']['SampleTemperature'])
            sample_pl_order = 2
        else:
            sample_t = None
            sample_pl_order = 1
        CalculatePlaczek(
            InputWorkspace=sam_corrected,
            IncidentSpectra=sam_incident_wksp,
            OutputWorkspace=sam_placzek,
            LambdaD=1.44,
            Order=sample_pl_order,
            ScaleByPackingFraction=False,
            SampleTemperature=sample_t)
        ConvertToHistogram(
            InputWorkspace=sam_placzek,
            OutputWorkspace=sam_placzek)
        if mtd[sam_corrected].YUnit() == "":
            mtd[sam_corrected].setYUnit("Counts")
        mtd[sam_placzek].setYUnit("Counts")
        if not mtd[sam_placzek].isDistribution():
            ConvertToDistribution(sam_placzek)
        bin_tmp = float(lambda_binning_calc.split(',')[1])
        mtd[sam_placzek] = mtd[sam_placzek] * bin_tmp
        ConvertUnits(
            InputWorkspace=sam_placzek,
            OutputWorkspace=sam_placzek,
            Target='MomentumTransfer',
            EMode='Elastic')
        Rebin(
            InputWorkspace=sam_placzek,
            OutputWorkspace=sam_placzek,
            Params=binning,
            PreserveEvents=True)
        save_banks(
            InputWorkspace=sam_placzek,
            Filename=nexus_filename,
            Title="sample_placzek",
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)
        Minus(
            LHSWorkspace=sam_corrected,
            RHSWorkspace=sam_placzek,
            OutputWorkspace=sam_corrected)
        sample_title += '_placzek_corrected'
        save_banks(
            InputWorkspace=sam_corrected,
            Filename=nexus_filename,
            Title=sample_title,
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)

  15. Calculate the normalized S(Q), fit the high-Q level, and apply an extra scale factor so that the high-Q level approaches 1,

    #################################################################
    # STEP 8: S(Q) and F(Q), bank-by-bank
    # processing includes
    # 1. to absolute scale
    # 2. save to S(Q)
    # 3. self scattering correction and save S(Q) corrected
    # 4. save to F(Q)
    #################################################################
    sam_corrected_norm = sam_corrected + '_norm'
    to_absolute_scale(sam_corrected, sam_corrected_norm)
    save_banks(
        InputWorkspace=sam_corrected_norm,
        Filename=nexus_filename,
        Title="SQ_banks_normalized",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
    if self_scattering_level_correction:
        sam_corrected_norm_scaled = sam_corrected_norm + '_scaled'
        _, bad_fitted_levels = \
            calculate_and_apply_fitted_levels(sam_corrected_norm,
                                              self_scattering_level_correction,
                                              sam_corrected_norm_scaled)
        if bad_fitted_levels:
            for bank, scale in bad_fitted_levels.items():
                final_message += \
                    f'Bank {bank} potentially has a tilted baseline with ' \
                    f'the fitted scale = {scale} ' \
                    f'and was not scaled in {sam_corrected_norm_scaled}\n'
        save_banks(
            InputWorkspace=sam_corrected_norm_scaled,
            Filename=nexus_filename,
            Title="SQ_banks_normalized_scaled",
            OutputDir=OutputDir,
            GroupingWorkspace=grp_wksp,
            Binning=binning)
    else:
        sam_corrected_norm_scaled = sam_corrected_norm  # just an alias
    fq_banks = 'FQ_banks'
    to_f_of_q(sam_corrected_norm_scaled, fq_banks)
    save_banks(
        InputWorkspace=fq_banks,
        Filename=nexus_filename,
        Title="FQ_banks",
        OutputDir=OutputDir,
        GroupingWorkspace=grp_wksp,
        Binning=binning)
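The high-Q level correction can be sketched as follows. This is a hedged, minimal stand-in for calculate_and_apply_fitted_levels: `scale_to_unit_high_q`, the plain averaging in place of a proper fit, and the Q window are all hypothetical.

```python
import numpy as np

# Hypothetical helper: estimate the high-Q level of S(Q) over a chosen
# Q window (here by a plain average rather than a fit) and rescale so
# the level approaches the expected self-scattering value of 1.
def scale_to_unit_high_q(q, sq, q_window=(25.0, 40.0)):
    mask = (q >= q_window[0]) & (q <= q_window[1])
    level = float(sq[mask].mean())   # estimated high-Q level
    return sq / level, level

# Toy S(Q) sitting 25% above the expected level of 1
q = np.linspace(0.5, 40.0, 500)
sq = 1.25 * np.ones_like(q)
sq_scaled, level = scale_to_unit_high_q(q, sq)
print(round(level, 2), round(float(sq_scaled[-1]), 2))  # 1.25 1.0
```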

  16. Save the Bragg pattern, grouped into physical banks,

    #################################################################
    # STEP 8.1 Output Bragg Diffraction
    # convert to TOF and save to GSAS
    #################################################################
    ConvertUnits(
        InputWorkspace=sam_corrected,
        OutputWorkspace=sam_corrected,
        Target="TOF",
        EMode="Elastic")
    ConvertToHistogram(
        InputWorkspace=sam_corrected,
        OutputWorkspace=sam_corrected)
    xmin, xmax = get_each_spectra_xmin_xmax(mtd[sam_corrected])
    CropWorkspaceRagged(
        InputWorkspace=sam_corrected,
        OutputWorkspace=sam_corrected,
        Xmin=xmin,
        Xmax=xmax)
    xmin_rebin = min(xmin)
    if "TMin" in alignAndFocusArgs:
        tmin = alignAndFocusArgs["TMin"]
        info_part1 = f"[Info] 'TMin = {tmin}' found in the input config file."
        info_part2 = " Will use it for Bragg output."
        print(info_part1 + info_part2)
        xmin_rebin = tmin
    xmax_rebin = max(xmax)
    # Note: For the moment, bin size for Bragg output is hard coded.
    # May need to make it user input if necessary.
    tof_binning = "{xmin},-0.0008,{xmax}".format(xmin=xmin_rebin,
                                                 xmax=xmax_rebin)
    Rebin(
        InputWorkspace=sam_corrected,
        OutputWorkspace=sam_corrected,
        Params=tof_binning)
    SaveGSS(
        InputWorkspace=sam_corrected,
        Filename=os.path.join(os.path.abspath(OutputDir), title + ".gsa"),
        SplitFiles=False,
        Append=False,
        MultiplyByBinWidth=True,
        Format="SLOG",
        ExtendedHeader=True)
    SaveFocusedXYE(
        InputWorkspace=sam_corrected,
        Filename=os.path.join(os.path.abspath(OutputDir), title + ".xye"))
    At this point, it can be seen that in the mantidtotalscattering workflow the total scattering data and the Bragg patterns are reduced through identical procedures.
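The negative step in the hard-coded Rebin parameters requests logarithmic binning, i.e. a constant relative bin width dTOF/TOF = 8e-4. A minimal sketch of how such bin edges are generated (`log_bin_edges` and the example TOF limits are hypothetical; Mantid additionally clamps the last edge to xmax):

```python
import numpy as np

# Hypothetical helper illustrating Mantid's negative-step Rebin
# convention: Params="xmin,-0.0008,xmax" makes each bin 0.08% wider
# than the previous one, so the relative bin width stays constant.
def log_bin_edges(xmin, xmax, dx=-0.0008):
    ratio = 1.0 + abs(dx)
    edges = [xmin]
    while edges[-1] < xmax:
        edges.append(edges[-1] * ratio)
    return np.array(edges)

edges = log_bin_edges(300.0, 16000.0)
rel_widths = np.diff(edges) / edges[:-1]
print(round(float(rel_widths.mean()), 6))  # 0.0008
```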

@Kvieta1990 Kvieta1990 self-assigned this May 30, 2022
@Kvieta1990 Kvieta1990 added Type: Docs Updates or fixes to documentation Priority: High High priority feature / fix labels May 30, 2022
@Kvieta1990 Kvieta1990 pinned this issue Jun 1, 2022