FSUMATH

This Week in Mathematics


Current Week [Feb 28, 2021 - Mar 06, 2021]

Entries for this week: 6
Tuesday March 02, 2021

Topology and Geometry Seminar [url]
Zariski dense surface groups in SL(2k+1,Z)
    - Darren Long, UC Santa Barbara
Time: 3:05pm Room: Zoom
Abstract/Desc: I'll introduce some of the history of thin groups and discuss a proof that there are Zariski dense surface groups in SL(2k+1,Z).

Wednesday March 03, 2021

Departmental Tea Time
C is for cookie, and shorthand for C[0,1] with the sup norm
Time: 3: Room: 204 LOV

Thursday March 04, 2021

Financial Mathematics Seminar [url]
Maximum principle for stochastic control of SDEs with measurable drifts
    - Ludovic Tangpi, Princeton University
Time: 3:05pm Room: fsu.zoom.us/j/97820191506
Abstract/Desc: Consider the stochastic optimal control of systems driven by stochastic differential equations with an irregular drift coefficient. In this talk, we will present a necessary and sufficient stochastic maximum principle. We will discuss the main ideas leading to the maximum principle, including an explicit representation of the first variation process (in the Sobolev sense) of the controlled diffusion and the application of Ekeland's variational principle to construct suitable approximating control problems with smooth coefficients. The talk is based on joint work with Olivier Menounkeu-Pamen.

Algebra and Its Applications [url]
Average Height Bounds
    - Kate Petersen, FSU
Time: 3:05pm Room: Zoom

Friday March 05, 2021

Colloquium Tea
Time: 3:00 pm Room: 204 LOV

2020-21 Mathematics Colloquium [url]
Artificial neural networks for solving differential equations
    - Marios Mattheakis, Harvard University
Time: 3:05pm Room: Zoom
Abstract/Desc: There has been a wave of interest in applying machine learning to the study of differential equations. The universal approximation theorem states that a neural network can approximate any continuous function with arbitrary accuracy. Moreover, the resulting predictions are analytical and differentiable, making neural networks a suitable approach to solving complicated problems governed by differential equations. In contrast to conventional data-driven machine learning methods, neural network solvers are equation-driven models that construct analytical functions satisfying a particular system of differential equations. The optimization is therefore a fully data-free process, resulting in an unsupervised learning method. The resulting network solvers are mesh-free, can predict solutions that identically satisfy any boundary or initial conditions, and yield an analytical form of the general solution, namely a solution as a function of the variables and of the initial and boundary conditions. A vital consequence is that these solutions are differentiable and can be inverted. This presentation will review the formulation of neural network solvers and discuss recent advances in solving ordinary, partial, and eigenvalue differential equation problems.
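The trial-solution idea in the abstract can be sketched in a few lines: a small network N(t) defines the trial solution u(t) = u0 + t*N(t), which satisfies the initial condition by construction, and training minimizes the residual of the equation at collocation points, with no data at all. This is a minimal illustrative sketch for the toy problem u' = -u, u(0) = 1; the network size, learning rate, and use of finite-difference gradients are assumptions made here for a self-contained example (real solvers use automatic differentiation), not the speaker's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy equation-driven ("data-free") solver sketch for u' = -u, u(0) = 1,
# whose exact solution is exp(-t). All hyperparameters are illustrative.
H = 10                                       # hidden units
p = rng.normal(scale=0.5, size=3 * H + 1)    # parameters: w1, b1, w2, b2

def unpack(p):
    return p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]

def net(t, p):                               # N(t): one hidden tanh layer
    w1, b1, w2, b2 = unpack(p)
    return np.tanh(np.outer(t, w1) + b1) @ w2 + b2

def net_dt(t, p):                            # dN/dt, computed analytically
    w1, b1, w2, _ = unpack(p)
    h = np.tanh(np.outer(t, w1) + b1)
    return (1.0 - h ** 2) @ (w1 * w2)

# Trial solution u(t) = 1 + t*N(t) satisfies u(0) = 1 identically,
# for any parameter values -- the boundary condition is built in.
def u(t, p):
    return 1.0 + t * net(t, p)

def u_dt(t, p):
    return net(t, p) + t * net_dt(t, p)

# Loss is the mean squared ODE residual at collocation points: no data.
t = np.linspace(0.0, 1.0, 20)

def loss(p):
    return np.mean((u_dt(t, p) + u(t, p)) ** 2)

def grad(p, eps=1e-6):                       # forward-difference gradient
    base, g = loss(p), np.zeros_like(p)
    for i in range(p.size):
        q = p.copy()
        q[i] += eps
        g[i] = (loss(q) - base) / eps
    return g

l0 = loss(p)
for _ in range(500):                         # plain gradient descent
    p -= 0.01 * grad(p)

print("residual loss fell:", loss(p) < l0)
print("u(0) =", u(np.array([0.0]), p)[0])
```

Because the initial condition is enforced by the form of the trial solution rather than by a penalty term, u(0) = 1 holds exactly at every stage of training; only the residual of the equation itself is optimized.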


Problems? Email webmaster@math.fsu.edu.