Fall 2018 Graduate Student Seminar



Meeting Tuesdays at 12:00pm in room WXLR A202.

Date    Speaker    Title & Abstract
08/28 Dylan Weber

In this talk we consider two agent-based models of opinion formation on networks: one stochastic in nature and one deterministic. Both carry an assumption of local consensus, i.e., when agents interact they "agree." We investigate global consensus in each model and compare their features. In both cases we find that convergence to a consensus is determined by the structure of the network on which the agents interact, and we provide a condition on the network structure that is equivalent to unconditional convergence to a consensus in both the stochastic and the deterministic case. This similarity is explained by exposing a mathematical link between the two models. Finally, we discuss some preliminary thoughts on how results from these models could be leveraged to answer questions about consensus in more complex models.
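(The abstract does not specify the two models. As a generic illustration only, the following minimal Python sketch uses an assumed DeGroot-style averaging rule, a standard deterministic opinion dynamics on a network in which connectivity is what drives convergence to consensus; it is not the speaker's model.)

import numpy as np

# Illustrative sketch: each agent synchronously replaces its opinion with
# the average of its neighbors' opinions (including its own).
def averaging_step(opinions, adjacency):
    n = len(opinions)
    new = np.empty(n)
    for i in range(n):
        nbrs = np.flatnonzero(adjacency[i]).tolist() + [i]
        new[i] = opinions[nbrs].mean()
    return new

# A connected path graph on 4 agents: repeated averaging drives all
# opinions to a common value.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
x = np.array([0.0, 1.0, 2.0, 3.0])
for _ in range(200):
    x = averaging_step(x, A)
print(x)  # all entries are (approximately) equal because the graph is connected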

09/11 David Polletta

In this talk I will discuss how all hyperbolic groups have a solvable word problem and admit a finite group presentation. The main points of the argument are showing that hyperbolic groups can be given a so-called "Dehn presentation," and that groups with a Dehn presentation have a solvable word problem and are finitely presented. We will also review Cayley graphs of finitely generated groups, δ-hyperbolic metric spaces, and the definition of a hyperbolic group.
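(As a reminder of the mechanism behind the main step; the phrasing below is mine, not the abstract's.) A presentation $\langle S \mid R \rangle$ of a group $G$ is a Dehn presentation if every nonempty freely reduced word $w$ with $w =_G 1$ contains a subword $u$ such that some relator factors as $r = uv$ with $|v| < |u|$. Dehn's algorithm then freely reduces $w$ and, while such a subword exists, replaces $u$ by the strictly shorter word $v^{-1}$ (legitimate since $u =_G v^{-1}$); the length decreases at each step, so the procedure terminates, and
\[
w =_G 1 \iff \text{the procedure terminates at the empty word},
\]
which is exactly a decision procedure for the word problem.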

10/02 Joseph Wells

Continued fractions have a very rich history and a number of surprising applications, from integer factorization algorithms to the classification of rational tangles in knot theory. Due to their usefulness, it has been asked whether there are natural generalizations of continued fractions, and past efforts have answered this question with varying degrees of success. In this talk, I'll present one such generalization, to the Heisenberg group, due to recent work of Lukyanenko and Vandehey (arXiv).
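(For background only: the classical continued fraction algorithm that the Heisenberg construction generalizes can be sketched in a few lines of Python by repeatedly taking integer parts and reciprocals. The function name and example value below are illustrative.)

from fractions import Fraction

def continued_fraction(x: Fraction):
    """Return the partial quotients [a0; a1, a2, ...] of a rational x."""
    quotients = []
    while True:
        a = x.numerator // x.denominator   # integer part (floor)
        quotients.append(a)
        frac = x - a
        if frac == 0:
            return quotients
        x = 1 / frac                       # reciprocal of the fractional part

print(continued_fraction(Fraction(355, 113)))  # [3, 7, 16], since 355/113 = 3 + 1/(7 + 1/16)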

10/30 Lauren Crider

My current work is focused on developing a stochastic filter on the Grassmann manifold, and that is what I want to chat with you about. Through a short and sweet background in statistical signal processing, we'll see how one might run into the Grassmann manifold in real life (you've got to be careful out there!), and we'll present a scenario wherein the goal is to detect, estimate, and make predictions, all on the surface of this Kähler manifold. A very well-known stochastic filter serves as the baseline motivation for this scenario, so we'll chat about that too.
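(The abstract does not name the baseline filter; the Kalman filter is the canonical stochastic filter on Euclidean space, so a minimal predict/update sketch is included below purely as background. The filter on the Grassmann manifold discussed in the talk is a different construction.)

import numpy as np

# Sketch of one predict/update cycle of a classical linear (Kalman) filter
# for the linear-Gaussian model x_{k+1} = F x_k + noise(Q), z_k = H x_k + noise(R).
def kalman_step(x, P, z, F, H, Q, R):
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with measurement z
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Example: track a scalar that stays (nearly) constant, observed with noise.
F = H = np.eye(1)
Q, R = np.array([[1e-4]]), np.array([[0.25]])
x, P = np.zeros(1), np.eye(1)
for z in [1.1, 0.9, 1.05, 0.97]:
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(x)  # estimate moves toward the true value near 1.0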