#include <NLPInterfacePack_NLPFirstDerivTester.hpp>
Public Types

enum ETestingMethod { FD_COMPUTE_ALL, FD_DIRECTIONAL }
Public Member Functions  
STANDARD_COMPOSITION_MEMBERS (CalcFiniteDiffProd, calc_fd_prod)  
 
STANDARD_MEMBER_COMPOSITION_MEMBERS (ETestingMethod, fd_testing_method)  
 
STANDARD_MEMBER_COMPOSITION_MEMBERS (size_type, num_fd_directions)  
 
STANDARD_MEMBER_COMPOSITION_MEMBERS (value_type, warning_tol)  
 
STANDARD_MEMBER_COMPOSITION_MEMBERS (value_type, error_tol)  
 
NLPFirstDerivTester (const calc_fd_prod_ptr_t &calc_fd_prod=Teuchos::rcp(new CalcFiniteDiffProd()), ETestingMethod fd_testing_method=FD_DIRECTIONAL, size_type num_fd_directions=1, value_type warning_tol=1e-8, value_type error_tol=1e-3)
Constructor.  
bool  finite_diff_check (NLP *nlp, const Vector &xo, const Vector *xl, const Vector *xu, const MatrixOp *Gc, const Vector *Gf, bool print_all_warnings, std::ostream *out) const 
This function takes an NLP object and its computed derivatives and function values and validates the functions and the derivatives by evaluating them about the given point x. If all the checks described in the intro check out, then this function will return true; otherwise it will return false.
Private Member Functions  
bool  fd_check_all (NLP *nlp, const Vector &xo, const Vector *xl, const Vector *xu, const MatrixOp *Gc, const Vector *Gf, bool print_all_warnings, std::ostream *out) const 
 
bool  fd_directional_check (NLP *nlp, const Vector &xo, const Vector *xl, const Vector *xu, const MatrixOp *Gc, const Vector *Gf, bool print_all_warnings, std::ostream *out) const 

There are two options for testing the derivatives by finite differences.

The first option (fd_testing_method==FD_COMPUTE_ALL) is to compute all of the derivatives as dense vectors and matrices. This option can be very expensive in runtime and storage costs. The amount of storage space needed is O(n*m), and f(x) and c(x) will each be evaluated O(n) times.
The other option (fd_testing_method==FD_DIRECTIONAL) computes products of the form g'*v and compares them to the finite-difference computed value g_fd'*v. This method only costs O(n) storage and two function evaluations per direction (assuming central differences are used). The directions v are computed randomly in [-1,+1] so that they are well scaled and should give good results. The option num_fd_directions() determines how many random directions are used. A value of num_fd_directions() <= 0 means that a single finite-difference direction of 1.0 will be used for the test.
This class computes the derivatives using a CalcFiniteDiffProd object, which can use up to fourth-order (central) finite differences but can use as low as first-order one-sided differences.
The client can set the tolerances used to measure whether the analytical values of Gf and Gc are close enough to the finite-difference values. Let the function h(x) be f(x) or any cj(x), for j = 1...m. Let gh(i) = d(h(x))/d(x(i)) and fdh(i) = finite_diff(h(x))/d(x(i)). Then define the relative error between the analytic value and the finite-difference value to be:

    err(i) = |gh(i) - fdh(i)| / (||gh||inf + ||fdh||inf + (epsilon)^(1/4))

The above error takes into account the relative sizes of the elements and also allows one or both of the elements to be zero without ending up with 0/0 or something like 1e-16 not comparing with zero.
All errors err(i) >= warning_tol are reported to *out if out != NULL and print_all_warnings==true. Otherwise, if out != NULL, only the number of elements and the maximum violation of the warning tolerance will be printed. The first error err(i) >= error_tol that is found is reported to *out if out != NULL, and finite_diff_check() immediately returns false. If all errors satisfy err(i) < error_tol, then finite_diff_check() will return true.
Given these two tolerances the client can do many things:
max_var_bounds_viol, so that the testing software will never evaluate f(x) or c(x) outside the region: xl - max_var_bounds_viol <= x <= xu + max_var_bounds_viol
Definition at line 120 of file NLPInterfacePack_NLPFirstDerivTester.hpp.
NLPInterfacePack::NLPFirstDerivTester::NLPFirstDerivTester (
    const calc_fd_prod_ptr_t &  calc_fd_prod = Teuchos::rcp(new CalcFiniteDiffProd()),
    ETestingMethod  fd_testing_method = FD_DIRECTIONAL,
    size_type  num_fd_directions = 1,
    value_type  warning_tol = 1e-8,
    value_type  error_tol = 1e-3
)
NLPInterfacePack::NLPFirstDerivTester::STANDARD_COMPOSITION_MEMBERS (CalcFiniteDiffProd, calc_fd_prod)

NLPInterfacePack::NLPFirstDerivTester::STANDARD_MEMBER_COMPOSITION_MEMBERS (ETestingMethod, fd_testing_method)

NLPInterfacePack::NLPFirstDerivTester::STANDARD_MEMBER_COMPOSITION_MEMBERS (size_type, num_fd_directions)

NLPInterfacePack::NLPFirstDerivTester::STANDARD_MEMBER_COMPOSITION_MEMBERS (value_type, warning_tol)

NLPInterfacePack::NLPFirstDerivTester::STANDARD_MEMBER_COMPOSITION_MEMBERS (value_type, error_tol)
bool NLPInterfacePack::NLPFirstDerivTester::finite_diff_check (
    NLP *  nlp,
    const Vector &  xo,
    const Vector *  xl,
    const Vector *  xu,
    const MatrixOp *  Gc,
    const Vector *  Gf,
    bool  print_all_warnings,
    std::ostream *  out
) const
This function takes an NLP object and its computed derivatives and function values and validates the functions and the derivatives by evaluating them about the given point x. If all the checks described in the intro check out, then this function will return true; otherwise it will return false.
nlp  [in] NLP object used to compute and test derivatives for.
xo  [in] Point at which the derivatives are computed.
xl  [in] If != NULL then these are the lower variable bounds.
xu  [in] If != NULL then these are the upper variable bounds. If xl != NULL then xu != NULL must also be true and vice versa, or a std::invalid_argument exception will be thrown.
Gc  [in] A matrix object for the Gc computed at xo. If Gc==NULL then this is not tested.
Gf  [in] Gradient of f(x) computed at xo. If Gf==NULL then this is not tested.
print_all_warnings  [in] If true then all errors greater than warning_tol will be printed if out!=NULL.
out  [in/out] If != NULL then some summary information is printed to it, and if a derivative does not match up then it prints which derivative failed. If out == 0 then no output is printed.
Returns true if all the derivatives check out, and false otherwise.

Definition at line 71 of file NLPInterfacePack_NLPFirstDerivTester.cpp.