Talks

Abstract: In
linear optics an optical system (including an eye) is completely
characterized by a 5x5 matrix with a special structure, called the ray
transference. The top-left 4x4 submatrix of a ray transference is
symplectic. An important issue in the quantitative analysis of optical
systems is the question of how to calculate an average of a set of eyes
or other optical systems. Mathematically, this corresponds to averaging
a set of matrices (ray transferences), provided that certain
requirements are fulfilled [1], [2]. In particular, the average of ray transferences
must be a ray transference. In this talk we discuss this problem and
propose a solution involving well-known matrix functions such as the
matrix exponential, the matrix logarithm and the Cayley-transform. New
mathematical issues arising in our research are pointed out.
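To see why the matrix exponential and logarithm are natural here: the entrywise arithmetic mean of transferences generally fails to remain a transference, whereas the "exponential mean" exp(mean(log T_i)) stays in the right matrix group. The sketch below is ours, not the speakers' method: a minimal pure-Python illustration using truncated series, valid only for matrices close to the identity (the 2x2 case shown, with nilpotent T - I, is even exact).

```python
def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_add(a, b, s=1.0):
    return [[a[i][j] + s * b[i][j] for j in range(len(a))]
            for i in range(len(a))]

def identity(n):
    return [[float(i == j) for j in range(n)] for i in range(n)]

def mat_log(t, terms=30):
    # Mercator series log(I + N) with N = T - I; valid for small (or nilpotent) N
    n = len(t)
    nmat = mat_add(t, identity(n), -1.0)
    out, p = [[0.0] * n for _ in range(n)], identity(n)
    for k in range(1, terms + 1):
        p = mat_mul(p, nmat)
        out = mat_add(out, p, (-1.0) ** (k + 1) / k)
    return out

def mat_exp(a, terms=30):
    # truncated exponential series exp(A) = I + A + A^2/2! + ...
    n = len(a)
    out, p = identity(n), identity(n)
    for k in range(1, terms + 1):
        p = [[v / k for v in row] for row in mat_mul(p, a)]
        out = mat_add(out, p)
    return out

def transference_mean(mats):
    # exponential of the average of logarithms, so the mean stays in the group
    n = len(mats[0])
    acc = [[0.0] * n for _ in range(n)]
    for t in mats:
        acc = mat_add(acc, mat_log(t), 1.0 / len(mats))
    return mat_exp(acc)
```

For two thin-element-style transferences [[1, 2], [0, 1]] and [[1, 4], [0, 1]] this returns [[1, 3], [0, 1]]; in this commuting toy case it agrees with the arithmetic mean, but for non-commuting transferences the two averages differ, which is the point of the construction.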
Abstract: Both
market practices and regulation are based on the strange principle
that economic relations are established in a random fashion. Since that
is not true, we will show, based on economic first principles, how the
"non-randomness" of economic relations influences the geometry of
the economic environment and how this geometry influences risk
measures and the behavior of economics-related metrics. The issue of
model risk will be discussed as a source of market opportunities.
Abstract: Wavelet analysis is a very promising tool
as it represents a refinement of Fourier analysis. In particular, it
allows one to take into account both the time and frequency domains
within a unified framework, that is, one can
assess simultaneously how variables are related at different frequencies and
how such relationships have evolved over time. Despite the potential value of
wavelet analysis, it is still a relatively unexplored tool in the study of
economic phenomena. To highlight the usefulness of wavelet analysis in
Economics and Finance, several examples are provided covering a wide range of
topics, such as forecasting, comovement analysis, and market risk assessment,
among others.
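The time-and-frequency decomposition the abstract describes can be illustrated with the simplest wavelet of all. This is our own minimal sketch (a Haar transform, not necessarily the wavelets used in the talk): each level splits a series into a low-frequency "approximation" and a high-frequency "detail", and the details retain their time location.

```python
def haar_step(x):
    """One level of the Haar wavelet transform: approximations capture the
    low-frequency (long-run) component, details the high-frequency one."""
    approx = [(x[i] + x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / 2.0 for i in range(0, len(x), 2)]
    return approx, detail

def haar_decompose(x, levels):
    """Multi-resolution decomposition: each level halves the frequency band."""
    details = []
    for _ in range(levels):
        x, d = haar_step(x)
        details.append(d)
    return x, details

# toy "return series": the level-1 details isolate period-2 fluctuations,
# the final approximation tracks the slow trend
approx, details = haar_decompose([4.0, 2.0, 6.0, 6.0], levels=2)
```

Comovement studies then compare the detail series of two variables level by level, i.e. frequency band by frequency band, across time.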
Abstract: The quantification of operational risk has
to deal with various concerns regarding data, much more so than other
types of risk which banks and insurers are obliged to manage. Several
studies, at first mostly empirical and by now more theoretically and
mathematically grounded, document several of those concerns. One of the
main questions that worries both researchers and practitioners is the
bias in the recorded operational loss amounts. We support the
assertions made by several authors and argue that this concern is
serious when modeling operational loss data and that it is typically
present in all databases, not only the commercial databases provided by
various vendors, but also databases where operational loss data are
collected and compiled internally.
We show that it is possible, under mild assumptions on the internal procedures put in place to manage operational losses, to make parametric inference using loss data statistics, that is, to estimate the parameters of the loss amount distribution while taking into consideration the bias which, if ignored, generates a twofold error in the estimators of the mean loss amount and the total loss amount: the former is overvalued and the latter undervalued.

We follow a different approach to parametric inference. We do not assume the existence of a threshold above which all losses are reported and available for analysis and estimation. Instead, we consider that the probability that a loss is reported, and ends up recorded for analysis, increases with the size of the loss. This causes the bias in the database but, at the same time, there is no threshold above which all losses are recorded; hence no loss has probability one of being recorded, in what we contend is a realistic framework. We derive the general formulae, present results for common theoretical distributions used to model (operational) loss amounts, and estimate the impact of ignoring the bias factor when estimating the value at risk.
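The twofold error is easy to reproduce in simulation. This sketch is ours, with made-up parameters: losses are drawn from a lognormal (a common illustrative choice), and a hypothetical recording probability increases with loss size but never reaches one, matching the no-threshold framework described above.

```python
import math
import random

random.seed(0)

# simulated "true" operational losses (lognormal, illustrative parameters)
true_mu, true_sigma = 0.0, 1.0
losses = [math.exp(random.gauss(true_mu, true_sigma)) for _ in range(20000)]

def p_record(x):
    # hypothetical recording probability: increasing in the loss size,
    # strictly below 1, so no loss is recorded with certainty
    return x / (1.0 + x)

recorded = [x for x in losses if random.random() < p_record(x)]

naive_mean = sum(recorded) / len(recorded)   # overvalued: large losses over-represented
naive_total = sum(recorded)                  # undervalued: some losses never recorded
true_mean = sum(losses) / len(losses)
true_total = sum(losses)
```

Running this shows exactly the pattern the abstract describes: the naive mean computed from the recorded sample exceeds the true mean, while the recorded total falls short of the true total.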
Abstract: The objective of this work is to develop a
mathematical framework for the modeling, control and optimization of
dynamic control systems whose state variable is driven by interacting
ODEs and PDEs. To the best of our knowledge, there are no optimal
control results for such systems, and results on optimal control of
systems with dynamics given by PDEs have been developed only for
certain classes of problems, [5]. This framework should provide a sound
basis for the design and control of new advanced engineering systems
arising in many important classes of applications, among them
gliders, [1,2], and mechanical fishes, [3], overcoming
the shortcomings of the currently available, often heuristic-based,
"mixed" approaches combining ODE control system results with numerical
techniques.
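As a toy instance of such flow-driven dynamics, assuming a planar Couette flow u(y) = k * y, a controlled particle reduces to an ODE system. The sketch below is our own numerical illustration (all parameters hypothetical): it integrates the particle at a constant heading and sweeps headings to approximate the minimum arrival time, a crude stand-in for the steering law a maximum-principle analysis would deliver analytically.

```python
import math

def time_to_target(theta, speed=1.0, shear=0.5, target=(2.0, 1.0),
                   dt=0.001, t_max=10.0, tol=0.1):
    """Euler-integrate a particle in planar Couette flow u(y) = shear * y,
    steered at constant heading theta; return first arrival time near the
    target, or None if the target is not reached within t_max."""
    x = y = t = 0.0
    while t < t_max:
        x += (shear * y + speed * math.cos(theta)) * dt  # flow + control
        y += speed * math.sin(theta) * dt                # control only
        t += dt
        if math.hypot(x - target[0], y - target[1]) < tol:
            return t
    return None

# crude sweep over constant headings (1-degree grid)
times = [time_to_target(math.radians(deg)) for deg in range(0, 90)]
best = min(t for t in times if t is not None)
```

With these parameters the shear assists downstream motion, so the best constant heading aims below the straight line to the target and arrives in well under the straight-line-against-still-water time bound.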
The general approach consists of designing a family of well-posed, robust, conventional optimal control problems whose dynamics converge to those of the original "hybrid" system, and then characterizing its solution as a certain type of limit of the conventional optimality conditions, [4,6], for the approximating problems. For now, the research effort has been focused on gaining insight by applying necessary conditions of optimality to dynamic control systems driven by planar Couette flow, which can easily be reduced to problems with ODE dynamics. In particular, the minimum time control problem of moving a particle between two given points, subject to certain classes of simple flows, has been solved using the maximum principle.

Abstract: Recent developments in financial markets
have revealed the limits of Brownian motion pricing models when they
are applied to actual markets. Lévy processes, which admit jumps over
time, have been found more useful for applications. Thus, we suggest a
Lévy model based on Forward-Backward Stochastic Differential Equations
(FBSDEs) for option pricing in a Lévy-type market. Moreover, we discuss
the existence of replicating portfolios and present an analog of the
Black–Scholes PDE in a Lévy-type market.
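To make the contrast with pure Brownian models concrete, here is our own minimal sketch (not the talk's FBSDE construction) of pricing a European call under a Merton-style jump diffusion, one of the simplest Lévy-type models: Brownian motion plus compound-Poisson lognormal jumps, with the drift compensated so the discounted price is a martingale. All parameter values are illustrative.

```python
import math
import random

random.seed(1)

def sample_poisson(lam):
    # Knuth's method for the Poisson jump count
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def mc_call_price(s0=100.0, strike=100.0, r=0.02, sigma=0.2, t=1.0,
                  lam=0.5, jump_mu=-0.1, jump_sigma=0.15, n_paths=20000):
    """Monte Carlo price of a European call under a Merton jump diffusion;
    lam * kappa compensates the jumps in the risk-neutral drift."""
    kappa = math.exp(jump_mu + 0.5 * jump_sigma ** 2) - 1.0  # E[e^J] - 1
    total = 0.0
    for _ in range(n_paths):
        jumps = sum(random.gauss(jump_mu, jump_sigma)
                    for _ in range(sample_poisson(lam * t)))
        z = random.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2 - lam * kappa) * t
                            + sigma * math.sqrt(t) * z + jumps)
        total += max(s_t - strike, 0.0)
    return math.exp(-r * t) * total / n_paths

price = mc_call_price()
```

Setting lam = 0 recovers a plain Black–Scholes Monte Carlo price, so the jump terms isolate exactly what the Lévy structure adds.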
Abstract: Banks face a regulatory and business need
to calculate through-the-cycle probabilities of default. This
presentation focuses on some techniques currently used in the
marketplace and on the future landscape.
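One of the simplest techniques in use is the long-run average: smooth point-in-time default rates over a full credit cycle. The sketch below is ours, with made-up illustrative figures, and shows only this baseline method, not the full range of techniques the presentation covers.

```python
# hypothetical annual observed (point-in-time) default rates over one credit
# cycle, including a recession spike
pit_default_rates = [0.010, 0.013, 0.025, 0.041, 0.022, 0.012, 0.009, 0.015]

# long-run average: the simplest through-the-cycle PD estimate, insensitive
# to where in the cycle the bank currently sits
ttc_pd = sum(pit_default_rates) / len(pit_default_rates)
```

By construction the estimate sits between the benign-year and recession-year rates, which is what makes it "through the cycle" rather than point in time.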