Inverse Analysis in FEniCS
@ Lorenzo Fiore | Wednesday, May 8, 2024 | 12 minutes read


Motivation and context

  • The FEniCS open-source computing platform is a wonderful collection of software: it contains all the tools needed to rigorously define a mathematical problem in computer code in a very high-level form. It feels almost like writing the equations of physics and letting the computer solve them for you!
  • Well… to be honest, FEniCS only provides the backbone of a full-fledged Finite Element software; it’s like having a kit to be assembled according to one’s taste: everything from the definition of the equations of physics to the handling of the incremental solving procedure and the output is the user’s responsibility.
  • This can seem inconvenient to a user accustomed to full-fledged commercial Finite Element software, but to me it’s marvelous! It’s the perfect platform for automating analyses and procedures, and I put it to the test by running an inverse analysis procedure: I defined a script with all the settings for the analysis of fracture in a Wedge Splitting Test, and I wanted to find the material parameters best suited to reproduce the experimental curve I had available.

Challenges I faced

  • For the purpose of the inverse analysis we can treat the simulation script as a black box: material parameters go in and a force-displacement curve comes out (together with some info on time-stepping, elastic energy, fracture propagation, … whatever I want or am able to code!).
  • The task is thus “just” to:
    1. compare the simulation output with the reference experimental one (for a simulation engineer, reality is the only truth and guidance);
    2. change the simulation parameters according to some smart logic, hoping to get closer to reality;
    3. run another simulation with the updated parameters;
    4. loop the previous steps until satisfied.
  • Task 1 is what people refer to as “defining an objective function”: if the goal is to get closer to the experimental results, we want to minimize some distance. What is this distance, and how do we compute it? The good old sum of squared differences comes in handy here, as in the snippet below.
import numpy as np

# read experimental data
experimental_data = {}
experimental_data["open mm"] = np.genfromtxt("experimental_data.md", usecols = 1)
experimental_data["HRF kN"] = np.genfromtxt("experimental_data.md", usecols = 2)

# read result file storing force and relative displacement
computed_data = {}
computed_data["open mm"] = np.genfromtxt("sim_ID_results.md", usecols = 1)
computed_data["HRF kN"] = np.genfromtxt("sim_ID_results.md", usecols = 2)

# interpolate experimental data at simulation time-steps
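# (note: np.interp assumes the experimental opening values are monotonically increasing)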
interpolated_experimental_data = np.interp(computed_data["open mm"], experimental_data["open mm"], experimental_data["HRF kN"])

# compute the objective function value
obj_func_val = np.sqrt(np.sum(np.square(computed_data["HRF kN"] - interpolated_experimental_data)))
  • Task 2 is a bit more involved: we have to understand how to change the parameters. As hinted above, inverse analysis can be framed as a minimization problem, so we can leverage the algorithms developed for such problems. My choice is the steepest descent algorithm with a variable step multiplier.
  • The idea is simple: identify the direction of maximum decrease of the objective function and take a step in that direction. But be careful! Choose the step length wisely, according to your previous knowledge.
  • For the computation of the gradient there is not much choice in this case: since the simulation is a black box, it has to be done with a finite-difference numerical approach (a sketch follows the log excerpt below).
  • The step-length computation is a bit more interesting: the idea is to test multiple step lengths, identify the one that leads to the minimum value of the objective function, and keep this information for the next iteration. This means more evaluations of the objective function at each iteration but, hopefully, fewer iterations than with a fixed step length.
  • A comment regarding objective function evaluation: each evaluation means a finite element computation, so the fewer the better, of course! Additionally, if we ask for the value of the function at the same point twice we shouldn’t run the related simulation twice, so it’s better to store the information somewhere (the sketch after the log excerpt below also shows a simple in-memory version of this caching).
  • Storing information is a good practice overall, and one I try to implement in all my scripts: whenever I have to run a long-lasting calculation I always make sure to write the main info of each step to a simple text file. It’s a good way to check that everything is running as intended and a good starting point for post-processing of the results. Below is an excerpt of such a log from one of my runs: each row reports the elapsed time in minutes, the total number of simulations run, the step multiplier alpha, the trial value of the parameter (here the Young’s modulus Emod), the corresponding objective function value and, when available, the gradient component and the update counter.
[min]	[sim N]	 [alpha]  	  [Emod]  	[obj_val] 	 [grad 0] 	 [update] 	
 26  	  2  	   0.1    	 1.31e+09 	  1003.2  	    -     	    -     	
 26  	  2  	   0.1    	 1.3e+09  	  1013.9  	-1.0664e+09	    0     	
 55  	  3  	   0.1    	1.4066e+09	  900.54  	    -     	    -     	
 55  	  4  	   0.2    	1.5133e+09	  788.59  	    -     	    1     	
 55  	  4  	   0.1    	1.4066e+09	  900.54  	    -     	    -     	
 69  	  5  	   0.2    	1.5233e+09	  778.98  	    -     	    -     	
 69  	  5  	   0.2    	1.5133e+09	  788.59  	-9.6115e+08	    1     	
 102 	  6  	   0.2    	1.7055e+09	  619.22  	    -     	    -     	
 102 	  7  	   0.4    	1.8977e+09	  498.15  	    -     	    2     	
 102 	  7  	   0.2    	1.7055e+09	  619.22  	    -     	    -     	
 119 	  8  	   0.4    	1.9077e+09	  495.15  	    -     	    -     	
 119 	  8  	   0.4    	1.8977e+09	  498.15  	-3.0014e+08	    2     	
 160 	  9  	   0.4    	2.0178e+09	  462.09  	    -     	    -     	
 160 	 10  	   0.8    	2.1378e+09	  423.51  	    -     	    3     	
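  • To make the ideas above concrete, here is a minimal, self-contained sketch of the procedure: a forward finite-difference gradient, a variable step multiplier chosen among a few candidates, and a simple in-memory cache that avoids repeated evaluations. The quadratic objective and all names here are stand-ins for illustration only; in the real procedure each evaluation launches a FEM simulation and the caching is done on disk by looking for existing result files (see the full script in the next section).
import numpy as np

# toy stand-in for the FEM black box: in the real procedure this function
# would launch a simulation and compare its force-displacement curve with
# the experimental one
def objective(params):
    return float(np.sum((params - np.array([2.0, -1.0]))**2))

cache = {}  # memo: never evaluate the same point twice

def cached_objective(params):
    key = tuple(np.round(params, 12))
    if key not in cache:
        cache[key] = objective(params)
    return cache[key]

def fd_gradient(params, h):
    # forward finite differences: one extra evaluation per parameter
    base = cached_objective(params)
    grad = np.zeros_like(params)
    for ii in range(len(params)):
        perturbed = params.copy()
        perturbed[ii] += h[ii]
        grad[ii] = (cached_objective(perturbed) - base) / h[ii]
    return grad

params = np.array([0.0, 0.0])   # initial guess
h = np.array([1e-6, 1e-6])      # finite-difference perturbations
alpha = 0.1                     # initial step multiplier
factors = [1.0, 2.0, 0.5]       # candidate step-length multipliers

for iteration in range(50):
    grad = fd_gradient(params, h)
    # test a few step lengths and keep the best one for the next iteration
    trials = [(f*alpha, cached_objective(params - f*alpha*grad)) for f in factors]
    alpha, best_val = min(trials, key=lambda t: t[1])
    params = params - alpha*grad
    if best_val < 1e-8:
        break

print(params, cached_objective(params))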

Useful lessons I learned

  • This project was the perfect excuse to un-rust my usage of classes in Python; most of my coding is for the automation of repetitive tasks and I don’t get to use the object-oriented paradigm a lot for that. In this case though I wanted an internal counter for the objective function, incremented every time the function itself is called, and I ended up doing so with an instance attribute (see the minimal sketch right before the full listing).
  • The script I ended up with is reported below; it can be used with any Finite Element software, provided that the software can take command-line input and write its results in text format.
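  • Before the full listing, here is a minimal sketch of that call-counting pattern (the names are hypothetical, not those of the actual class in the script below):
class CountedFunction:
    """wrap a callable and count how many times it has been evaluated"""
    def __init__(self, func):
        self.func = func
        self.calls = 0          # instance attribute used as internal counter

    def evaluate(self, x):
        self.calls += 1         # incremented at every evaluation
        return self.func(x)

f = CountedFunction(lambda x: x**2)
f.evaluate(3.0)
f.evaluate(4.0)
print(f.calls)                  # -> 2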
"""
- The script is meant to calculate material properties for a material
  by performing inverse analysis on experimental results using the 
  constitutive model with phase field fracture here implemented in
  FEniCS
- This script contains only the inverse analysis procedure with
  a simple steepest descent algo while the FEM simulation and the 
  related settings are implemented in other scripts (see WST.py)
- The objective function is embedded in a class that creates it 
  taking as input the FEM model file, the material data and the 
  labels of the parameters to identify
"""

#%% ---------------------------------------------------------------70
# Import necessary packages
# -----------------------------------------------------------------70
import os
import sys
import time
from datetime import datetime
import numpy as np


#%% ---------------------------------------------------------------70
# Control parameters
# -----------------------------------------------------------------70

material_data = "lab_test_240416T1520"

experimental_data_file = material_data + "_Fd_data.csv"

FEM_file = "WST.py"

parameters_labels = ["Emod"]

# initial guess consistent with the choice for the sim parameters
initial_guess = [1.3e9]

# setup limits for the parameters space
limits = [
    [0.8e9, 50e9]
]

# perturbation for the calculation of the gradient of objective function
perturbation = [0.01e9]

# scale of the parameters 
scale_factor = [1e15]

# columns identifiers for the simulation output file
interested_cols_identifiers = ["open mm", "HRF kN"]

# initialize step length
alpha = 0.1

# allow the step length multiplier to change
explore_alpha_variation_flag = 1

# set the increasing-decreasing alpha factor
exploration_factors_list = [1, 2, 0.5]

# tolerance for the minimization of the objective function
tolerance = 0.1

# limit on the number of simulations for the minimization algo
max_sims_counter = 50 

# allow objective function to increase for a certain number of times
max_objective_function_increase_counter = 10

# maximum number of inverse analysis iterations
max_iterations = 100

# specify if you want to timestamp the output or not
timestamp_flag = 1


#%% ---------------------------------------------------------------70
# Filesystem settings
# -----------------------------------------------------------------70

folder_descriptor = "Inverse_Analysis_" + material_data

now = datetime.now()
dt_string = now.strftime("%Y%m%dT%H%M")[2:]

if timestamp_flag:
    outfolder = dt_string + "_" + folder_descriptor
else:
    outfolder = folder_descriptor

# create output folder
os.system("mkdir " + outfolder)

# copy this file to the inverse analysis folder for forensics
os.system("cp " + __file__ + " ./" + outfolder + "/" + 
        __file__.split("/")[-1].replace(".py", "") +
        "_" + dt_string + ".py" )

# copy the sim file to the inverse analysis folder for forensics
os.system("cp " + FEM_file + " ./" + outfolder + "/" + 
        FEM_file.split("/")[-1].replace(".py", "") +
        "_" + dt_string + ".py" )

# copy the material data file to the inverse analysis folder for forensics
os.system("cp " + material_data + ".py" + 
            " ./" + outfolder + "/" + material_data + "_" + dt_string + ".py")

# inverse analysis output file
outfile = outfolder + "/" + "README.md"

# prompt for forensic
readme = input("Describe why to run this inverse analysis procedure: \n")
with open(outfile, "w") as file:
    file.writelines(readme)


#%% ---------------------------------------------------------------70
# Objective function definition
# -----------------------------------------------------------------70

def fileprint(string):
    print(string)
    with open(outfile, "a") as file:
        file.writelines( "\n" + string)


def retrieve_info(file, interested_cols_identifiers):
    # take out header string
    with open(file, "r") as file_opener:
        head = [next(file_opener) for _ in range(3)]
    header_string = head[1]
    
    # identify columns corresponding to interested quantities
    interested_cols = {}
    character_counter = 0
    word_counter = -1
    close = -1
    for character in header_string:
        character_counter += 1
        if character == "[":
            init = character_counter
        elif character == "]":
            close = character_counter
            identifier = header_string[init:close-1]
            word_counter += 1
            if identifier in interested_cols_identifiers:
                interested_cols[identifier] = word_counter
    
    computed_data = {}
    for identifier in interested_cols_identifiers:
        computed_data[identifier] = np.genfromtxt(
                            file,
                            skip_header = 2, 
                            skip_footer = 2, 
                            usecols = interested_cols[identifier]
                        )
    return computed_data


class objective_function_creator():
    def __init__(self, parameters_labels, FEM_file, experimental_data, material_data, interested_cols_identifiers, outfolder):
        self.parameters_labels = parameters_labels
        self.FEM_file = FEM_file
        self.experimental_data = experimental_data
        self.material_data = material_data
        self.sims_counter = 0
        self.interested_cols_identifiers = interested_cols_identifiers
        self.outfolder = outfolder
        self.nparams = len(parameters_labels)
    
    def evaluate(self, params):
        """inverse analysis objective function
        Inputs: 
            - the parameters to optimize for
            - FEM simulation file 
            - experimental data file
        Output: a single scalar, sqrt of sum of the differences between 
                experimental and calculated at each sim time step
        """
        
        # pack the parameters in an identification string 
        sim_ID = ""    
        for i in range(self.nparams):
            sim_ID += self.parameters_labels[i] + "_" + str(params[i]) + "_"
        
        # check if simulation was already run
        sim_already_run_flag = 0
        for file in os.listdir(self.outfolder):
            if file == sim_ID + ".md":
                sim_already_run_flag = 1
        
        # run sim if necessary
        if sim_already_run_flag == 0:
            # printout a message to monitor inverse analysis procedure
            print(f"running sim: " + sim_ID)
            
            # launch the FEM with the input parameters
            os.system("python " + self.FEM_file + " " + self.material_data + " " + sim_ID + " " + self.outfolder)
            
            # update simulations counter
            self.sims_counter += 1
            
            # check completion
            completed_flag = 0
            while completed_flag == 0:
                time.sleep(1)  # poll once per second to avoid busy-waiting
                for file in os.listdir("./" + self.outfolder + "/"):
                    if file == "simulation_completed.md":
                        completed_flag = 1
            
            # delete the flag file
            os.system("rm " + self.outfolder + "/" + "simulation_completed.md")

        # read result file
        computed_data = retrieve_info(self.outfolder + "/" + sim_ID + ".md", self.interested_cols_identifiers)
        
        # interpolate experimental data at simulation time-steps
        interpolated_experimental_data = np.interp(computed_data["open mm"], self.experimental_data[:,0], self.experimental_data[:,1])*1e-3
        
        # be careful with the consistency of the units of measure, the factor 1e3 serves this purpose
        obj_func_val = 1e3*np.sqrt(np.sum(np.square(computed_data["HRF kN"] - interpolated_experimental_data)))

        return obj_func_val


def update_params(parameters_array, increment, limits):
    # update parameters keeping them within the prescribed limits
    
    # careful with the += operator, it would update the input in place !!!
    parameters_array = parameters_array + increment
    
    for ii in range(len(parameters_array)):
        parameters_array[ii] = max(limits[ii][0], parameters_array[ii])
        parameters_array[ii] = min(limits[ii][1], parameters_array[ii])
        
    return parameters_array


def forensic_printout(sims_counter, alpha, params, obj_func_val, gradient, update):
    # printout CPU time
    info_string = f"{int((time.time() - tic)/60):^5}\t"

    # printout number of simulations run
    info_string += f"{sims_counter:^5}\t"

    # printout alpha value
    info_string += f"{alpha:^10.5g}\t"

    # printout parameters
    for param in params:
        info_string += f"{param:^10.5g}\t"

    # printout objective function value
    info_string += f"{obj_func_val:^10.5g}\t"

    # printout gradient components
    for ii in range(nparams):
        try:
            info_string += f"{gradient[ii]:^10.5g}\t"
        except (ValueError, TypeError):
            # gradient may contain the '-' placeholder, which does not accept the g format
            info_string += f"{gradient[ii]:^10}\t"

    # printout placeholder for the update flag
    info_string += f"{update:^10}\t"
    
    fileprint( info_string )

#%% ---------------------------------------------------------------70
# Messages to the user
# -----------------------------------------------------------------70

# number of parameters
nparams = len(parameters_labels)

# README file heading
info_string = f"{'[min]':^5}\t" +\
              f"{'[sim N]':^5}\t" +\
              f"{'[alpha]':^10}\t"

for parameter in parameters_labels:
    parameter = '[' + parameter + ']'
    info_string += f"{parameter:^10}\t"

info_string += f"{'[obj_val]':^10}\t"

for ii in range(nparams):
    info_string += f"{'[grad ' + str(ii) + ']':^10}\t"

info_string += f"{'[update]':^10}\t"

fileprint( info_string )

# create dummy grad list for forensic printout
dummy_grad = []
for ii in range(nparams):
    dummy_grad.append('-')

#%% ---------------------------------------------------------------70
# Objective function optimization
# -----------------------------------------------------------------70

# read experimental data file
experimental_data = np.loadtxt( experimental_data_file )

# create the objective function specifying the material data
objective_function = objective_function_creator(
    parameters_labels, 
    FEM_file, 
    experimental_data, 
    material_data, 
    interested_cols_identifiers, 
    outfolder
)

# setup control variables for the minimization algo
params = np.array(initial_guess)
scale_factor = np.array(scale_factor)
perturbation_matrix = np.diag(perturbation)
update_counter = 0
obj_func_val_min = np.inf
objective_function_increase_counter = 0
iteration = 0

# measure elapsed time for inverse analysis procedure
tic = time.time()

# minimize objective function
while iteration <= max_iterations:
    iteration += 1
    
    if objective_function.sims_counter > max_sims_counter:
        fileprint("Max number of allowed simulations reached.")
        break
    
    # compute objective function
    obj_func_val = objective_function.evaluate(params)
    
    # check if tolerance has been reached
    if obj_func_val < tolerance:
        # gradient evaluation is not necessary
        forensic_printout(objective_function.sims_counter, alpha, params, obj_func_val, dummy_grad, update_counter)
        fileprint("Tolerance reached, procedure exited successfully !!!")
        break 
    
    # check that objective function is not increasing
    if obj_func_val >= obj_func_val_min:
        objective_function_increase_counter += 1
        if objective_function_increase_counter > max_objective_function_increase_counter:
            fileprint(f"Objective function has increased {objective_function_increase_counter} times, exiting inverse analysis procedure.")
            break    

    # update historic value of objective function
    obj_func_val_min = min(obj_func_val, obj_func_val_min)
    
    # start gradient computation by initializing empty gradient list
    gradient = []
    
    for ii in range(nparams):
        
        # compute perturbed params using perturbation
        # perturbed_params = params + perturbation_matrix[:,ii]
        perturbed_params = update_params(params, perturbation_matrix[:,ii], limits)
   
        # compute objective function value at perturbed parameters
        perturbed_params_obj_func_val = objective_function.evaluate(perturbed_params)
        
        # printout for forensic at each objective function evaluation
        forensic_printout(objective_function.sims_counter, alpha, perturbed_params, perturbed_params_obj_func_val, dummy_grad, '-')
        
        # update the gradient list
        gradient.append( ( perturbed_params_obj_func_val - obj_func_val )/perturbation[ii] )
    
    # store gradient information in a numpy array
    gradient = np.multiply( scale_factor, np.array(gradient) )
    
    # normalize gradient in case it is multi-variables
    # if nparams > 1:
        # gradient = gradient/np.linalg.norm(gradient)
    
    # printout for forensic after the gradient calculation
    forensic_printout(objective_function.sims_counter, alpha, params, obj_func_val, gradient, update_counter)
        
    # update the counter after the gradient calculation
    update_counter += 1
    
    # update alpha if necessary
    if explore_alpha_variation_flag:
        exploration = {}
        exploration["sims_counter"] = []
        exploration["alpha"] = []
        exploration["params"] = []
        exploration["obj_func_val"] = []
        exploration["update"] = []
        for factor in exploration_factors_list:
            alpha = factor*alpha
            # exploration_params = params - alpha*gradient
            exploration_params = update_params(params, -alpha*gradient, limits)
            exploration_obj_func_val = objective_function.evaluate(exploration_params)
            
            # save data for identification and forensic
            exploration["sims_counter"].append(objective_function.sims_counter)
            exploration["alpha"].append(alpha)
            exploration["params"].append(exploration_params)
            exploration["obj_func_val"].append(exploration_obj_func_val)
            exploration["update"].append('-')
    
        # identify which alpha values corresponds to the min objective function
        min_index = np.argmin(exploration["obj_func_val"])
        
        # update alpha with value corresponding to min objective function 
        alpha = exploration["alpha"][min_index]
        
        # flag the chosen alpha
        exploration["update"][min_index] = update_counter
        
        for ii in range(len(exploration_factors_list)):
            # printout for forensic at each objective function evaluation
            forensic_printout(
                exploration["sims_counter"][ii],
                exploration["alpha"][ii],
                exploration["params"][ii],
                exploration["obj_func_val"][ii],
                dummy_grad,
                exploration["update"][ii]
            )
    
    # update parameters with the steepest descent algo
    # params += -alpha*gradient
    params = update_params(params, -alpha*gradient, limits)

if iteration > max_iterations:
    fileprint("Max iterations reached.")

fileprint(f"Elapsed CPU time: {int((time.time() - tic)/60):5d} [min]")
