2:30-3:30 pm
September 4: Alexei Mishchenko (Omsk University), Universal invariants for classes of abelian groups
Abstract: The problem of elementary equivalence for abelian groups was solved by W. Szmielew in 1955: she gave a criterion for the elementary equivalence of two abelian groups in terms of elementary invariants. We define universal invariants for an arbitrary class of abelian groups and prove analogues of Szmielew's theorem for universal classes of abelian groups. We introduce the concept of a principal universal class, namely the universal class generated by a single group A. For each principal universal class we introduce a canonical group, so that two principal universal classes are equal if and only if they have the same canonical group.
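For orientation (my addition, not part of the abstract): one commonly cited family of Szmielew's elementary invariants consists of truncated Ulm-type dimensions; the formulation below is a standard one and may differ slightly from the convention used in the talk. For each prime p and integer n >= 0,

    U_{p,n}(A) \;=\; \min\bigl(\dim_{\mathbb{F}_p}\, (p^{n}A)[p] \,/\, (p^{n+1}A)[p],\ \omega\bigr),

where A[p] denotes the p-torsion subgroup {a in A : pa = 0}. Szmielew's criterion combines such dimensions with further invariants (for example, ones recording divisibility and bounded exponent).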
October 16: Matvei Kotov (Stevens Institute of Technology), Analysis of a key exchange protocol based on tropical matrix algebra
Abstract: In this talk I will discuss a two-party key exchange protocol proposed in [D. Grigoriev and V. Shpilrain, Tropical cryptography, Comm. Algebra 42 (2014), 2624--2632], which uses a tropical matrix algebra as the platform. Our analysis shows that there is a heuristic algorithm which finds private keys in reasonable time. This is joint work with Alexander Ushakov.
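To illustrate the platform (this sketch is mine, not from the talk): in the tropical (min-plus) semiring, "addition" is min and "multiplication" is ordinary addition, and any two tropical polynomials evaluated at the same matrix commute, which is the structural fact a Stickel-style exchange of the kind analyzed in the cited paper exploits. The toy exchange below is a simplified illustration in that spirit; the exact scheme and parameters are in the Grigoriev-Shpilrain paper.

    import numpy as np

    def trop_mul(X, Y):
        # tropical (min-plus) matrix product: (X*Y)[i,j] = min_k (X[i,k] + Y[k,j])
        return np.min(X[:, :, None] + Y[None, :, :], axis=1)

    def trop_poly(coeffs, A):
        # evaluate p(x) = min_d (c_d + x^{tensor d}) at a matrix A; the tropical
        # identity matrix has 0 on the diagonal and +inf elsewhere
        n = A.shape[0]
        power = np.full((n, n), np.inf)
        np.fill_diagonal(power, 0.0)
        result = np.full((n, n), np.inf)
        for c in coeffs:
            result = np.minimum(result, c + power)
            power = trop_mul(power, A)
        return result

    rng = np.random.default_rng(0)
    A = rng.integers(-10, 10, (5, 5)).astype(float)   # public matrices
    B = rng.integers(-10, 10, (5, 5)).astype(float)

    p1, p2 = [3, 0, 7], [1, 4]     # Alice's private tropical polynomials
    q1, q2 = [2, 5], [0, 6, 1]     # Bob's private tropical polynomials

    u = trop_mul(trop_mul(trop_poly(p1, A), B), trop_poly(p2, B))   # Alice -> Bob
    v = trop_mul(trop_mul(trop_poly(q1, A), B), trop_poly(q2, B))   # Bob -> Alice

    k_alice = trop_mul(trop_mul(trop_poly(p1, A), v), trop_poly(p2, B))
    k_bob   = trop_mul(trop_mul(trop_poly(q1, A), u), trop_poly(q2, B))
    # both parties get the same key because polynomials in the same matrix commute
    assert np.array_equal(k_alice, k_bob)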
October 23: Maggie Habeeb (California University of Pennsylvania), Verifiable secret sharing
Abstract: A verifiable secret sharing scheme is a secret sharing scheme in which the participants can check that their shares are correct, which addresses the problem of a dishonest dealer. In this talk, we will review the verifiable secret sharing scheme introduced by Feldman and propose a verification protocol, motivated by Feldman's, to be used with the Habeeb-Kahrobaei-Shpilrain secret sharing scheme.
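For context (my addition, not part of the abstract): in Feldman's scheme the dealer hands out Shamir shares of a secret and broadcasts commitments to the polynomial coefficients, and each participant checks their own share against those commitments. Below is a minimal sketch with toy, insecure parameters; it is not the verification protocol proposed in the talk and not the Habeeb-Kahrobaei-Shpilrain scheme.

    import random

    # Toy parameters: q divides p - 1 and g has order q modulo p.  Far too small to be secure.
    p, q, g = 23, 11, 2
    t, n = 3, 5                                   # threshold t, n participants
    secret = 7
    coeffs = [secret] + [random.randrange(q) for _ in range(t - 1)]

    def f(x):                                     # Shamir polynomial over Z_q
        return sum(a * x**j for j, a in enumerate(coeffs)) % q

    shares = {i: f(i) for i in range(1, n + 1)}           # share for participant i
    commitments = [pow(g, a, p) for a in coeffs]          # dealer broadcasts g^{a_j} mod p

    def verify(i, s_i):
        # participant i checks g^{s_i} == prod_j (g^{a_j})^{i^j}  (mod p)
        lhs = pow(g, s_i, p)
        rhs = 1
        for j, c in enumerate(commitments):
            rhs = rhs * pow(c, i**j, p) % p
        return lhs == rhs

    assert all(verify(i, s) for i, s in shares.items())
    assert not verify(1, (shares[1] + 1) % q)             # a corrupted share fails the check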
October 30: Anna Choromanska (NYU), Optimization for large-scale machine learning: large data and large model
Abstract: The talk will focus on selected challenges in modern large-scale machine learning in two settings: (i) the large data setting and (ii) the large model (deep learning) setting. The first part of the talk will focus on the case when the learning algorithm needs to be scaled to large data. The multi-class classification problem will be addressed, where the number of classes (k) is extremely large, with the goal of obtaining train and test time complexity logarithmic in the number of classes. A reduction of this problem to a set of binary classification problems organized in a tree structure will be discussed. A top-down online approach for constructing logarithmic-depth trees, based on a new objective function, will be demonstrated. Under favorable conditions, the new approach leads to logarithmic-depth trees whose leaves have low label entropy. The approach comes with theoretical guarantees following from convex analysis, even though the underlying problem is inherently non-convex.

The second part of the talk focuses on the theoretical analysis of a more challenging non-convex learning setting: deep learning with multilayer networks. Despite the success of convex methods, deep learning methods, whose objective is inherently highly non-convex, have enjoyed a resurgence of interest in the last few years and achieve state-of-the-art performance. Recent findings suggest that we might eventually be able to describe these approaches theoretically. A connection will be established between the highly non-convex loss function of a simple model of the fully-connected feed-forward neural network and the Hamiltonian of the spherical spin-glass model. It will be shown that, under certain assumptions, (i) for large-size networks, most local minima are equivalent and yield similar performance on a test set; (ii) the probability of finding a “bad” local minimum, i.e. one with a high value of the loss, is non-zero for small-size networks and decreases quickly with network size; (iii) struggling to find the global minimum on the training set (as opposed to one of the many good local ones) is not useful in practice and may lead to overfitting. A discussion of open problems concludes the talk.
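To make the complexity claim concrete (an illustrative sketch of mine, not the construction or objective function from the talk): if the k classes sit at the leaves of a balanced binary tree and every internal node carries a binary router, a prediction costs one router evaluation per level, i.e. O(log k) binary decisions instead of k score evaluations. The linear routers below are random stand-ins for learned ones.

    import numpy as np

    class Node:
        def __init__(self, classes, dim, rng):
            self.classes = classes
            if len(classes) == 1:
                self.left = self.right = self.w = None
            else:
                self.w = rng.standard_normal(dim)      # stand-in for a learned binary router
                mid = len(classes) // 2
                self.left = Node(classes[:mid], dim, rng)
                self.right = Node(classes[mid:], dim, rng)

        def predict(self, x):
            # one binary decision per level: O(log k) work per example
            if self.w is None:
                return self.classes[0]
            child = self.left if self.w @ x <= 0 else self.right
            return child.predict(x)

    rng = np.random.default_rng(0)
    k, dim = 1024, 32
    root = Node(list(range(k)), dim, rng)
    x = rng.standard_normal(dim)
    print(root.predict(x))        # descends ~log2(k) = 10 routers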
November 6: Anja Moldenhauer (Hamburg University), Cryptosystems using automorphisms of finitely generated free groups
Abstract: I will introduce cryptosystems based on Nielsen reduced sets and automorphisms of finitely generated free groups. A random choice of these automorphisms is needed, and we give an approach to making this random choice using Whitehead automorphisms.
If the matrix group SL(2, Q) is also used for the cryptosystems, then the security additionally rests on the membership problem in SL(2, Q); at present no algorithm is known to solve this problem.
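As a small illustration of the platform (my sketch, not the cryptosystems from the talk, which use Whitehead automorphisms and Nielsen reduced sets): elements of a free group can be represented as freely reduced words, and random automorphisms can be built by composing elementary Nielsen transformations of the form x_i -> x_i x_j, which together with inversions and permutations of the generators generate Aut(F_n).

    import random

    # Freely reduced words in F_rank: the integer k stands for x_k, and -k for its inverse.
    def free_reduce(word):
        out = []
        for letter in word:
            if out and out[-1] == -letter:
                out.pop()
            else:
                out.append(letter)
        return tuple(out)

    def nielsen(i, j):
        # elementary Nielsen automorphism: x_i -> x_i x_j, all other generators fixed
        def phi(word):
            out = []
            for letter in word:
                if letter == i:
                    out += [i, j]
                elif letter == -i:
                    out += [-j, -i]        # (x_i x_j)^{-1} = x_j^{-1} x_i^{-1}
                else:
                    out.append(letter)
            return free_reduce(out)
        return phi

    def random_automorphism(rank, length, rng=random):
        # compose `length` random elementary Nielsen transformations
        steps = [nielsen(*rng.sample(range(1, rank + 1), 2)) for _ in range(length)]
        def phi(word):
            for step in steps:
                word = step(word)
            return word
        return phi

    phi = random_automorphism(rank=3, length=10)
    print(phi((1, 2, -3)))         # image of x_1 x_2 x_3^{-1} under a random automorphism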
November 20: Jordi Delgado (Universitat Politècnica de Catalunya), Stallings graphs and pullbacks for Z^m x F_n
Abstract: We extend to the family of free-abelian-times-free groups (Z^m x F_n) the classical Stallings theory for describing subgroups of the free group as automata. This approach provides a geometric way of understanding several algorithmic results already proved in [Delgado, Ventura, Algorithmic problems for free-abelian times free groups, J. Algebra 391 (2013), 256-283], and naturally reveals some unexpected facts about this apparently naive family. In particular, I will discuss an adaptation to this new framework of the well-known pullback technique to deal with intersections of finitely generated subgroups. This is joint work with Enric Ventura.
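For readers unfamiliar with Stallings automata (my sketch, covering only the classical free-group case, not the Z^m x F_n extension from the talk): a finitely generated subgroup H of F_n is represented by a finite labeled graph obtained by gluing a loop for each generator of H at a basepoint and then repeatedly folding equally labeled edges that leave a common vertex; membership of a freely reduced word in H then becomes a path-reading check.

    # Words use integers: k stands for x_k, -k for its inverse.
    def fold(edges):
        # repeatedly identify the targets of two equally labeled edges leaving one vertex
        changed = True
        while changed:
            changed = False
            seen = {}
            for (u, lab, v) in edges:
                if (u, lab) in seen and seen[(u, lab)] != v:
                    keep, drop = sorted((seen[(u, lab)], v))
                    edges = {(keep if a == drop else a, l, keep if b == drop else b)
                             for (a, l, b) in edges}
                    changed = True
                    break
                seen[(u, lab)] = v
        return edges

    def stallings_graph(generators):
        # wedge of subdivided loops at basepoint 0, one loop per generating word, then fold
        edges, fresh = set(), 1
        for word in generators:
            current = 0
            for idx, letter in enumerate(word):
                target = 0 if idx == len(word) - 1 else fresh
                if target != 0:
                    fresh += 1
                edges.add((current, letter, target))
                edges.add((target, -letter, current))   # inverse edge
                current = target
        return fold(edges)

    def contains(graph, word):
        # a freely reduced word lies in the subgroup iff it reads a closed path at the basepoint
        step = {(u, lab): v for (u, lab, v) in graph}
        current = 0
        for letter in word:
            if (current, letter) not in step:
                return False
            current = step[(current, letter)]
        return current == 0

    H = stallings_graph([(1, 1), (1, 2)])        # H = <x1^2, x1 x2> in F_2
    assert contains(H, (1, 1)) and contains(H, (1, 2))
    assert not contains(H, (1,))                 # x1 is not in H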
December 4: Manhattan Algebra Day