;+
; NAME:
;   regiondna_conf_load_data.pro
;
; PURPOSE:
;   This function loads climate data for use in ipccar6_conf.pro.
;
; CATEGORY:
;   REGIONAL D-AND-A CONFIDENCE
;
; CALLING SEQUENCE:
;   result = regiondna_conf_load_data( file_list, period=period, $
;       season_id=season_id, season_len=season_len, var_label=var_label )
;
; INPUTS:
;   FILE_LIST:  A required string vector specifying the full file names of
;       the files to load.  Each of the N_SET values contains the
;       "&"-delimited file list for each of the N_SET sets (e.g. different
;       simulations of the same model under the same forcing scenario).
;       File lists can contain the "*" wildcard.
;   ANOMALY, GUESS_FILE_RESTRICT, MASK_FILE, MASK_OPTIONS, MASK_VAR_LABEL,
;       PERIOD, PRETEND_PERIOD, REGION_DEF, REGION_LAT, REGION_LON,
;       REGION_MASK, SEASON_ID, SEASON_LEN, TIME_SEG_LEN, VAR_LABEL
;
; KEYWORD PARAMETERS:
;   ANOMALY:  An optional scalar string specifying the manner in which to
;       calculate anomalies.  Supported values are 'absolute' and
;       'fractional'.  The default is variable-specific (see code).
;   GUESS_FILE_RESTRICT:  If set, then the function guesses which subset of
;       files in the file list for each set should actually be read, based
;       on the dates in the file names.  Requires that the files follow a
;       supported file naming convention, such as that used in the CMIP6
;       archive.  This is useful for efficient reading of segments of long
;       noise simulations.
;   MASK_FILE:  A string scalar specifying the full file name of the NetCDF
;       file containing the region mask data.  Required if
;       REGION_PREPROCESSED is not set or REGION_MASK is not input.
;   MASK_OPTIONS:  An optional string scalar specifying whether the mask
;       data should be converted to 'land', 'ocean', or 'no mask' (i.e.
;       just get the spatial grid) format.
;   MASK_VAR_LABEL:  A string scalar specifying the label of the region
;       mask variable in the MASK_FILE file.  Required if MASK_FILE is
;       input.
;   PERIOD:  An optional integer 2-element vector specifying the start and
;       end years of the time period to load.  For instance
;       PERIOD=[1991,2010] loads the 20 years between and including 1991
;       and 2010.
;   PRETEND_PERIOD:  An optional integer 2-element vector following the
;       PERIOD format.  Rather than specifying an actual time period to
;       read from file, this effectively specifies the length of the time
;       period to load, such that multiple sequential periods can be
;       loaded.  This is used to extract sequential segments from noise
;       simulations.
;   REGION_DEF:  A string vector of length N_REGION defining the regions.
;       See regiondna_region_mask.pro for supported definition formats.  If
;       REGION_PREPROCESSED is set, then the format should be
;       "<realm>=<region>", where "<realm>" is the modifier to the realm
;       value in the NetCDF file name and global attribute and "<region>"
;       is the region identifier in the regionid NetCDF variable.  For
;       example, for the WRAF2-v4.1 region 6.1.1 (DR Congo) use
;       'WRAF2-v4-1=6.1.1'.  If REGION_PREPROCESSED is set, then the
;       "<realm>" value for all regions must be the same, and the data must
;       be in the same file for all regions.
;   REGION_ID:  Returns an N_REGION string vector specifying the
;       identifiers of the regions included in RESULT.
;   REGION_LAT:  A float vector of length N_REGION_LAT specifying the
;       latitude dimension of REGION_MASK.  Required if REGION_MASK is set.
;   REGION_LON:  A float vector of length N_REGION_LON specifying the
;       longitude dimension of REGION_MASK.  Required if REGION_MASK is
;       set.
;   REGION_MASK:  A float array of size N_REGION_LON,N_REGION_LAT,N_REGION
;       defining the regional masks to apply to the data for each of the
;       N_REGION regions.  Values should be in the [0,1] range.  Required
;       if REGION_DEF is not input, otherwise ignored.
;   REGION_PREPROCESSED:  If set, then the regional averages are assumed to
;       have been already calculated in the input NetCDF files.  The
;       default is to assume gridded data for which this code needs to do
;       the regional extraction and averaging.
;   REGION_REALM:  An optional scalar string specifying a label for the
;       type of regions.  For instance, for the WRAF05-v4-1 regions this
;       might be 'WRAF05-v4-1'.  Used for generating file names for
;       processed data.
;   REGION_SAVE:  If set, then the processed data are saved to file.
;   SEASON_ID:  A required scalar integer specifying the index of the
;       middle (or first middle) month of the season to output.  See
;       months_to_seasons.pro for details.  Note that SEASON_ID=5,
;       SEASON_LEN=12 returns January-December annual means.
;   SEASON_LEN:  A required scalar integer specifying the number of months
;       in the season to be extracted.  See months_to_seasons.pro for
;       details.
;   TIME_SEG_LEN:  An optional scalar integer specifying the number of
;       years to average together into the output.  For instance
;       TIME_SEG_LEN=10 produces decadal averages.  The default is 1 (i.e.
;       no averaging).
;   VAR_LABEL:  A required scalar string specifying the label of the
;       climate variable to load, e.g. 'pr' for precipitation.
;
; OUTPUT:
;   RESULT:  Returns an N_TIME,N_SET,N_REGION array of the regionally
;       averaged and time-segment-averaged data for each region.
;   REGION_ID, REGION_LAT, REGION_LON, REGION_MASK
;
; USES:
;   add_dim.pro
;   convert_time_format.pro
;   regiondna_region_mask.pro
;   mask_lonlattime.pro
;   months_to_seasons.pro
;   netcdf_read_geo.pro
;   netcdf_read_geo_multitime.pro
;   netcdf_write.pro
;   process_lonlatmonth.pro
;   str.pro
;   string_substitute.pro
;
; PROCEDURE:
;   This function reads data from NetCDF files.
;
; EXAMPLE:
;   See regiondna_conf.pro.
;
; MODIFICATION HISTORY:
;   Written by:  Daithi A. Stone (dastone@runbox.com), 2014-12-23 (As part
;       of hanseng_conf_assess.pro)
;   Modified:  DAS, 2020-10-29 (Extracted and adapted from
;       hanseng_conf_assess.pro)
;   Modified:  DAS, 2020-11-26 (Added directory change for some CMIP6
;       output due to write permissions)
;   Modified:  DAS, 2020-12-04 (Added REGION_ID, REGION_REALM, and
;       REGION_SAVE keyword inputs and option;  Removed requirement for
;       season averaging;  Removed automatic anomaly setting)
;   Modified:  DAS, 2020-12-09 (Corrected issue with time units)
;   Modified:  DAS, 2021-04-08 (Increased speed for multi-section noise
;       simulations)
;   Modified:  DAS, 2021-12-23 (Completed documentation)
;-

;***********************************************************************

FUNCTION REGIONDNA_CONF_LOAD_DATA, $
    FILE_LIST, $
    ANOMALY=anomaly, $
    GUESS_FILE_RESTRICT=guess_file_restrict_opt, $
    MASK_FILE=mask_file, MASK_OPTIONS=mask_options, $
    MASK_VAR_LABEL=mask_var_label, $
    PERIOD=period, TIME_SEG_LEN=time_seg_len, $
    PRETEND_PERIOD=period_pretend, $
    REGION_DEF=region_def, REGION_ID=region_id, $
    REGION_REALM=region_realm, $
    REGION_MASK=region_mask, REGION_LON=region_lon, REGION_LAT=region_lat, $
    REGION_PREPROCESSED=region_preprocessed_opt, REGION_SAVE=region_save_opt, $
    SEASON_ID=season_id, SEASON_LEN=season_len, $
    VAR_LABEL=var_label

;***********************************************************************
; Constants and options

; Determine the number of data sets to load
n_set = n_elements( file_list )
if n_set eq 0 then stop

; Ensure a variable label has been input
if not( keyword_set( var_label ) ) then stop

; Ensure the period has been specified
if max( n_elements( period ) eq [ 0, 2 ] ) ne 1 then stop

;; Ensure the season has been specified
;if ( n_elements( season_id ) ne 1 ) or not( keyword_set( season_len ) ) $
;    then stop

; The default time segment length (one year)
if not( keyword_set( time_seg_len ) ) then time_seg_len = 1

; Determine the number of years and time segments
if keyword_set( period ) then begin
  n_year = period[1] - period[0] + 1
endif else if keyword_set( period_pretend ) then begin
  n_year = period_pretend[1] - period_pretend[0] + 1
endif else begin
  if not( keyword_set( region_save_opt ) ) then stop
endelse
if keyword_set( n_year ) then begin
  if n_year mod time_seg_len ne 0 then stop
  n_time = n_year / time_seg_len
endif

; The default anomaly
if keyword_set( anomaly ) then begin
  if var_type( anomaly ) ne 7 then begin
    if var_label eq 'pr' then begin
      anomaly = 'fractional'
    endif else if max( var_label eq [ 'tas', 'tos' ] ) eq 1 then begin
      anomaly = 'absolute'
    endif else begin
      ; Not yet supported
      stop
    endelse
  endif
endif

; The number of regions
n_region = n_elements( region_def )
if n_region eq 0 then begin
  if not( keyword_set( region_mask ) ) or not( keyword_set( region_lon ) ) $
      or not( keyword_set( region_lat ) ) then stop
  n_region = n_elements( region_mask[0,0,*] )
endif

; The option to load already region-masked and -averaged data
region_preprocessed_opt = keyword_set( region_preprocessed_opt )
; Determine region details for pre-calculated regional data
if ( region_preprocessed_opt eq 1 ) and not( keyword_set( region_realm ) ) $
    then begin
  if not( keyword_set( region_def ) ) then stop
  if max( strpos( region_def, '&' ) ) ge 0 then stop
  if max( strpos( region_def, ',' ) ) ge 0 then stop
  region_id = region_def
  for i_region = 0, n_region - 1 do begin
    temp = strsplit( region_def[i_region], '=', extract=1, count=n_temp )
    if n_temp ne 2 then stop
    if i_region eq 0 then begin
      region_realm = temp[0]
    endif else begin
      if temp[0] ne region_realm then stop
    endelse
    region_id[i_region] = temp[1]
  endfor
endif

; The option to save region-masked and -averaged data
region_save_opt = keyword_set( region_save_opt )
if region_save_opt eq 1 then begin
  if not( keyword_set( region_realm ) ) then stop
endif

; The option to guess a restriction of the file list based on dates in the
; file names (useful for efficient reading of segments of long noise
; simulations)
guess_file_restrict_opt = keyword_set( guess_file_restrict_opt )

;***********************************************************************
; Load data

; Get the grid and land-sea mask for this source, if region masking needs
; to be done
if region_preprocessed_opt eq 0 then begin
  if keyword_set( mask_file ) then begin
    mask_data = netcdf_read_geo( mask_file, mask_var_label, lon=lon_data, $
        lat=lat_data, quiet=1 )
    if mask_var_label eq 'sftlf' then mask_data = reform( mask_data ) / 100.
    n_lon = n_elements( lon_data )
    n_lat = n_elements( lat_data )
  endif else begin
    temp_mask_file = strtrim( strsplit( file_list[0], '&,', extract=1 ), 2 )
    temp_mask_file = file_search( temp_mask_file, count=n_temp_file )
    if n_temp_file eq 0 then stop
    temp_mask_file = temp_mask_file[0]
    mask_data = netcdf_read_geo( temp_mask_file, '', lon=lon_data, $
        lat=lat_data, quiet=1 )
    n_lon = n_elements( lon_data )
    n_lat = n_elements( lat_data )
    mask_data = 1. + fltarr( n_lon, n_lat )
  endelse
  ; Reproduce the mask for each region (the region mask will be applied
  ; later)
  if n_region gt 1 then begin
    mask_data = add_dim( mask_data, 2, n_region )
  endif
endif

; Convert to the requested mask type
if region_preprocessed_opt eq 0 then begin
  if keyword_set( mask_var_label ) and keyword_set( mask_options ) then begin
    if ( mask_var_label eq 'sftlf' ) $
        and ( max( mask_options eq 'land' ) eq 1 ) then begin
      ; Nothing to do
    endif else if ( mask_var_label eq 'sftlf' ) $
        and ( max( mask_options eq 'ocean' ) eq 1 ) then begin
      ; Reverse to an ocean mask
      mask_data = 1. - mask_data
    endif else if max( mask_options eq 'no mask' ) then begin
      ; Include everything
      mask_data[*,*,*] = 1.
    endif else begin
      ; Not yet supported
    endelse
  endif
endif

; Load the region mask
if region_preprocessed_opt eq 0 then begin
  if keyword_set( region_def ) then begin
    if not( keyword_set( temp_mask_file ) ) then temp_mask_file = mask_file
    mask_var_region = fltarr( n_lon, n_lat, n_region )
    for i_region = 0, n_region - 1 do begin
      mask_var_region[*,*,i_region] = regiondna_region_mask( lon=lon_data, $
          lat=lat_data, region=region_def[i_region], $
          mask_file=temp_mask_file )
    endfor
  ; Or interpolate the input region mask
  endif else if keyword_set( region_mask ) then begin
    if not( keyword_set( region_lon ) ) then stop
    if not( keyword_set( region_lat ) ) then stop
    mask_var_region = fltarr( n_lon, n_lat, n_region )
    for i_region = 0, n_region - 1 do begin
      temp_lon = region_lon
      temp_lat = region_lat
      temp_mask = region_mask[*,*,i_region]
      mask_lonlattime, temp_mask, lon=temp_lon, lat=temp_lat, $
          mask_lon=lon_data, mask_lat=lat_data
      mask_var_region[*,*,i_region] = temp_mask
    endfor
  endif else begin
    stop
  endelse
  ; Add the region weighting to the mask
  mask_data = mask_data * mask_var_region
  ; Add a time dimension to the mask
  if region_save_opt eq 0 then begin
    mask_data = add_dim( mask_data, 2, n_year * 12 )
  endif
endif

; Initialise the data array
if region_save_opt eq 0 then begin
  var_data = !values.f_nan * fltarr( n_time, n_set, n_region )
endif
; Initialise the set counter
ctr_set = 0

; Iterate through sets
for i_set = 0, n_set - 1 do begin

  ; Extract the file list
  temp_file = strtrim( strsplit( file_list[i_set], ',', extract=1 ), 2 )

  ; Modify the file list for data that has already had the regional
  ; processing done
  if region_preprocessed_opt eq 1 then begin
    ; Modify the file list
    temp_file = string_substitute( temp_file, '/atmos/', $
        '/atmos-'+region_realm+'/', regex=1 )
    temp_file = string_substitute( temp_file, '_Amon_', $
        '_Amon-'+region_realm+'_', regex=1 )
    temp_file = string_substitute( temp_file, 'Amon/', $
        'Amon-'+region_realm+'/', regex=1 )
    temp_label_in_lon = 'regionid,region'
    ; A directory change for CMIP6 (because of write permissions)
    spawn, 'echo $HOSTNAME', temp
    if strpos( temp, 'maui.niwa.co.nz' ) ge 0 then begin
      temp_file = string_substitute( temp_file, '/CMIP6/', '/CMIP6-fixed/', $
          regex=1 )
    endif
  endif

  ; Determine the time period covered by this simulation, if appropriate
  if keyword_set( period_pretend ) then begin
    ; Get the time dimension for this simulation
    temp = netcdf_read_geo_multitime( temp_file, '', quiet=1, $
        time=temp_time_data, units_time=temp_time_units, $
        calendar=temp_time_calendar, label_in_lon='nolon', $
        label_in_lat='nolat', no_lat_sort=1 )
    n_temp_time_data = n_elements( temp_time_data )
    temp_time_data = convert_time_format( temp_time_data, temp_time_units, $
        'yyyymm', calendar=temp_time_calendar )
    ; Determine the appropriate periods (aligning with the last December,
    ; not the first January)
    id_start = min( where( strmid( temp_time_data, 4, 2 ) eq '01', $
        n_id_start ) )
    if n_id_start eq 0 then stop
    n_sim = ( n_temp_time_data - id_start ) / 12 / n_year
    id_end = max( where( strmid( temp_time_data, 4, 2 ) eq '12', $
        n_id_end ) )
    if n_id_end eq 0 then stop
    if id_end lt id_start then stop
    id_start = id_start + ( id_end - id_start - n_sim * 12 * n_year ) + 1
    temp_period = intarr( 2, n_sim )
    temp = fix( strmid( temp_time_data[id_start], 0, 4 ) )
    temp_period[*,0] = [ temp, temp + n_year - 1 ]
    for i_sim = 1, n_sim - 1 do begin
      temp_period[*,i_sim] = temp_period[*,i_sim-1] + n_year
    endfor
    temp_period = str( temp_period, length=4, filler='0' )
    temp_period[0,*] = temp_period[0,*] + '0101'
    temp_period[1,*] = temp_period[1,*] + '1231'
    ; Expand the data array
    if n_sim gt 1 then begin
      var_data = [ [ var_data ], $
          [ !values.f_nan * fltarr( n_time, n_sim - 1, n_region ) ] ]
    endif
  ; Otherwise we will have a normal extraction for PERIOD
  endif else if keyword_set( period ) then begin
    temp_period = str( period ) + [ '0101', '1231' ]
    n_sim = 1
  endif else begin
    temp_period = 0
    n_sim = 1
  endelse

  ; Iterate through sections of the simulation if it is raw data
  if region_preprocessed_opt eq 0 then begin
    n_sim_raw = n_sim
  endif else begin
    n_sim_raw = 1
  endelse
  for i_sim_raw = 0, n_sim_raw - 1 do begin

    ; See if we can meaningfully restrict the file list
    if guess_file_restrict_opt eq 1 then begin
      temp_file_list = file_search( temp_file, count=n_temp_file_list )
      if n_temp_file_list eq 0 then stop
      temp_file_period = strarr( 2, n_temp_file_list )
      for i_file = 0, n_temp_file_list - 1 do begin
        temp = strsplit( temp_file_list[i_file], '_.-', extract=1, $
            count=n_temp )
        temp_file_period[*,i_file] = temp[n_temp-3:n_temp-2]
      endfor
      if max( strlen( temp_file_period ) ) gt 6 then stop
      if region_preprocessed_opt eq 0 then begin
        temp = strmid( temp_period[*,i_sim_raw], 0, 6 )
      endif else begin
        temp = strmid( [ temp_period[0,0], temp_period[1,n_sim-1] ], 0, 6 )
      endelse
      id = where( ( temp_file_period[1,*] ge temp[0] ) $
          and ( temp_file_period[0,*] le temp[1] ), n_id )
      if n_id eq 0 then stop
      temp_file_list = temp_file_list[id]
    ; Otherwise take all requested files
    endif else begin
      temp_file_list = temp_file
    endelse

    ; Load the data for this source
    ;print, temp_file_list[0]
    temp_time_units = 0
    if region_preprocessed_opt eq 0 then begin
      temp_period_use = temp_period[*,i_sim_raw]
    endif else begin
      temp_period_use = [ temp_period[0,0], temp_period[1,n_sim-1] ]
    endelse
    temp_var_data_all = netcdf_read_geo_multitime( temp_file_list, $
        var_label, period_time=temp_period_use, anomaly=anomaly, $
        ref_period=temp_period_use, quiet=1, time=temp_time_data_all, $
        lon=temp_lon_data, lat=temp_lat_data, $
        label_in_lon=temp_label_in_lon, units_time=temp_time_units )
    temp_var_data_all = reform( temp_var_data_all )

    ; Iterate through sections of the simulation if this is preprocessed
    ; data
    if region_preprocessed_opt eq 0 then begin
      n_sim_proc = 1
    endif else begin
      n_sim_proc = n_sim
    endelse
    for i_sim_proc = 0, n_sim_proc - 1 do begin

      ; Extract the data for this section
      if region_preprocessed_opt eq 0 then begin
        temp_var_data = temporary( temp_var_data_all )
        temp_time_data = temporary( temp_time_data_all )
      endif else begin
        id_1 = 12 * n_year * i_sim_proc
        id_2 = id_1 + 12 * n_year - 1
        temp_var_data = temp_var_data_all[*,id_1:id_2]
        temp_time_data = temp_time_data_all[id_1:id_2]
        if ( min( finite( temp_var_data ) ) eq 0 ) $
            and ( n_sim_proc gt 1 ) then stop
        ; Recalculate the anomaly
        if anomaly eq 'absolute' then begin
          for i_region = 0, n_region - 1 do begin
            temp_var_data[i_region,*] = temp_var_data[i_region,*] $
                - mean( temp_var_data[i_region,*], nan=1 )
          endfor
        endif
        if anomaly eq 'fractional' then begin
          for i_region = 0, n_region - 1 do begin
            temp_var_data[i_region,*] = temp_var_data[i_region,*] $
                / mean( temp_var_data[i_region,*], nan=1 )
          endfor
        endif
      endelse
      n_temp_time = n_elements( temp_time_data )

      ; Take the spatial mean over each region
      if region_preprocessed_opt eq 0 then begin
        temp_region_data = !values.f_nan * fltarr( n_temp_time, n_region )
        for i_region = 0, n_region - 1 do begin
          temp = temp_var_data
          if region_save_opt eq 0 then begin
            temp_mask_data = mask_data[*,*,*,i_region]
          endif else begin
            temp_mask_data = add_dim( mask_data[*,*,i_region], 2, $
                n_temp_time )
          endelse
          temp_lon = temp_lon_data
          temp_lat = temp_lat_data
          process_lonlatmonth, temp, lon=temp_lon, lat=temp_lat, $
              weight=temp_mask_data, integrate='mean=1,2', $
              coverage=temp_coverage
          temp_region_data[*,i_region] = temp
        endfor
        temp_var_data = transpose( temp_region_data )
      ; Or extract the data for the requested regions
      endif else begin
        index = intarr( n_region )
        for i_region = 0, n_region - 1 do begin
          id = where( temp_lon_data eq region_id[i_region], n_id )
          if n_id ne 1 then stop
          index[i_region] = id[0]
        endfor
        temp_var_data = temp_var_data[index,*]
      endelse
      temp_var_data = reform( temp_var_data, n_region, n_temp_time )

      ; Extract the seasonal data
      if ( n_elements( season_id ) eq 1 ) $
          and ( n_elements( season_len ) eq 1 ) then begin
        temp_var_data = months_to_seasons( temp_var_data, season_id, $
            season_len )
        if keyword_set( temp_coverage ) then begin
          temp_coverage = months_to_seasons( temp_coverage, season_id, $
              season_len )
        endif
      endif

      ; Take the average over time segments
      if time_seg_len gt 1 then begin
        temp_var_data_new = fltarr( n_region, n_time )
        for i_time = 0, n_time - 1 do begin
          id_time = [ i_time, i_time + 1 ] * time_seg_len - [ 0, 1 ]
          temp_data = temp_var_data[*,id_time[0]:id_time[1]]
          ;temp_var_data_new[i_time] = total( temp_data, nan=1 ) $
          ;    / total( finite( temp_data ) )
          if keyword_set( temp_coverage ) then begin
            temp_mask = temp_coverage[*,id_time[0]:id_time[1]]
          endif else begin
            temp_mask = float( finite( temp_data ) )
          endelse
          temp_var_data_new[*,i_time] = total( temp_data * temp_mask, 2 ) $
              / total( temp_mask, 2 )
        endfor
      endif else begin
        temp_var_data_new = temp_var_data
      endelse

      ; Record the data
      if region_save_opt eq 0 then begin
        var_data[*,ctr_set,*] = reform( transpose( temp_var_data_new ), $
            n_time, 1, n_region )
      endif
      ; Increment the set counter
      ctr_set = ctr_set + 1

      ; If we want to save the data to a NetCDF file
      if region_save_opt eq 1 then begin
        ; Modify the output file name
        out_file = file_search( temp_file_list, count=n_out_file )
        for i_file = 0, n_out_file - 1 do begin
          temp_units = 0
          temp = netcdf_read_geo( out_file[i_file], '', time=temp_time, $
              units_time=temp_units, calendar=temp_calendar, quiet=1 )
          temp_ym = convert_time_format( temp_time, temp_units, 'yyyymm', $
              calendar=temp_calendar )
          if i_file eq 0 then begin
            out_period = [ min( temp_ym ), max( temp_ym ) ]
            out_period_0 = [ min( temp_ym ), max( temp_ym ) ]
          endif else begin
            out_period = [ min( [ min( temp_ym ), out_period[0] ] ), $
                max( [ max( temp_ym ), out_period[1] ] ) ]
          endelse
        endfor
        out_file = string_substitute( out_file[0], out_period_0[0]+'-', $
            out_period[0]+'-', regex=1 )
        out_file = string_substitute( out_file, '-'+out_period_0[1], $
            '-'+out_period[1], regex=1 )
        out_file = string_substitute( out_file, '/atmos/', $
            '/atmos-'+region_realm+'/', regex=1 )
        out_file = string_substitute( out_file, '_Amon_', $
            '_Amon-'+region_realm+'_', regex=1 )
        out_file = string_substitute( out_file, 'Amon/', $
            'Amon-'+region_realm+'/', regex=1 )
        ; A directory change for CMIP6 (because of write permissions)
        spawn, 'echo $HOSTNAME', temp
        if strpos( temp, 'maui.niwa.co.nz' ) ge 0 then begin
          out_file = string_substitute( out_file, '/CMIP6/', $
              '/CMIP6-fixed/', regex=1 )
        endif
        ; Ensure the directory exists
        out_dir = strsplit( out_file, '/', extract=1, count=n_temp )
        out_dir = '/' + strjoin( out_dir[0:n_temp-2], '/' )
        spawn, 'mkdir -p ' + out_dir
        ; Write the data to the NetCDF file
        temp_out_data = reform( transpose( temp_var_data_new ), $
            n_temp_time, n_region )
        temp_data_attribute_label = [ 'standard_name', 'long_name', $
            'units' ]
        temp_dim1_attribute_label = [ 'standard_name', 'long_name', $
            'units', 'calendar', 'axis' ]
        temp_dim1_attribute_value = [ '', '', temp_time_units, $
            temp_calendar, '' ]
        temp_dim2_vector = [ [ region_id ], [ region_def ] ]
        temp_dim2_label = [ 'regionid', 'namelong' ]
        temp_dim2_type = [ 2, 7 ]
        temp_dim2_attribute_label = [ $
            [ 'standard_name', 'long_name', '' ], $
            [ 'standard_name', 'long_name', 'comment' ] ]
        temp_dim2_attribute_value = [ $
            [ 'region_id', 'Region_Identifier', '' ], $
            [ 'region_name', 'Region_Name', 'Definition of region' ] ]
        temp_global_attribute_label = [ 'title' ]
        temp_global_attribute_value = [ $
            'Processed data for analysis for IPCC AR6 WGII Mountains ' $
            + 'Cross-chapter Paper.' ]
        netcdf_write, out_file, data_array=temp_out_data, $
            data_label=var_label, $
            data_attribute_label=temp_data_attribute_label, $
            dim1_vector=temp_time_data, dim1_label='time', $
            dim1_attribute_label=temp_dim1_attribute_label, $
            dim1_attribute_value=temp_dim1_attribute_value, $
            dim2_vector=temp_dim2_vector, dim2_label=temp_dim2_label, $
            dim2_attribute_label=temp_dim2_attribute_label, $
            dim2_attribute_value=temp_dim2_attribute_value, $
            dim2_dim_label='region', $
            global_attribute_label=temp_global_attribute_label, $
            global_attribute_value=temp_global_attribute_value, $
            driver_name='regiondna_conf_load_data.pro'
        tmp_command = 'ncpdq -a time,region ' + out_file + ' ' + out_dir $
            + '/transposed.nc'
        spawn, tmp_command, exit_status=temp_status
        if temp_status ne 0 then stop
        spawn, 'mv ' + out_dir + '/transposed.nc ' + out_file
      endif

    endfor
  endfor
endfor

;***********************************************************************
; The end

;stop
if region_save_opt eq 1 then var_data = !values.f_nan
return, var_data
END
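;***********************************************************************
; USAGE SKETCH (hypothetical)
;
; A minimal sketch of a call to this function, kept as comments so the
; source file still compiles.  It assumes gridded monthly files and a
; land-area-fraction mask file; the file names, model label, and region
; definitions below are illustrative placeholders, not real data, and the
; region definition strings must follow a format supported by
; regiondna_region_mask.pro:
;
;   file_list = [ 'tas_Amon_ModelA_historical_r1i1p1f1_*.nc', $
;       'tas_Amon_ModelA_historical_r2i1p1f1_*.nc' ]
;   region_def = [ 'WRAF05-v4-1=6.1', 'WRAF05-v4-1=6.2' ]
;   result = regiondna_conf_load_data( file_list, var_label='tas', $
;       period=[1961,2010], season_id=5, season_len=12, time_seg_len=10, $
;       anomaly='absolute', region_def=region_def, $
;       mask_file='sftlf_fx_ModelA.nc', mask_var_label='sftlf', $
;       mask_options='land' )
;
; With these settings RESULT would be a 5x2x2 (N_TIME,N_SET,N_REGION)
; array of decadal means of land-weighted regional averages of
; January-December annual-mean temperature anomalies: 50 years in 10-year
; segments, 2 sets (one per file-list entry), and 2 regions.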