ὅδε οἶκος, ὦ ἑταῖρε, μνημεῖον ἐστιν ζῴων τῶν σοφῶν ἀνδρῶν, καὶ τῶν ἔργων αὐτῶν

ARTIFICIAL INTELLIGENCE Seminar


PROGRAM


Program of the Artificial Intelligence Seminar for APRIL 2021



Registration form for participation, and the link to the lecture if you are already registered:
https://miteam.mi.sanu.ac.rs/asset/CW5nJWDSEZDj7p32p
If you only wish to watch the lecture, without the option of active participation, the stream will be available at:
https://miteam.mi.sanu.ac.rs/asset/4LNW8WtML7rLKojoz
Short instructions in Serbian and English can be found at this link:
https://miteam.mi.sanu.ac.rs/asset/Kc7qJtEvoMFx9MFnz



WEDNESDAY, 07.04.2021, at 19:00, Live stream
Predrag Janičić, Matematički fakultet, Univerzitet u Beogradu
AUTOMATED REASONING AND EXAMPLES OF REASONING SYSTEMS FOR PROPOSITIONAL LOGIC, FIRST-ORDER LOGIC, AND GEOMETRY
The lecture will give a brief overview of the field of automated reasoning, in particular its central subfield, automated theorem proving, as well as some of its application areas. Several of the speaker's systems that use automated reasoning in propositional logic, first-order logic, and geometry will also be briefly presented, along with some of their concrete applications.
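To give a flavour of the simplest setting the lecture covers, here is a minimal sketch of a propositional theorem prover based on the classical DPLL procedure. This is a generic textbook algorithm, not one of the speaker's systems; the encoding (integer literals, CNF as a list of frozensets) is an assumption made for brevity.

```python
def assign(clauses, lit):
    """Simplify a CNF formula under the assumption that `lit` is true."""
    out = []
    for c in clauses:
        if lit in c:
            continue                 # clause already satisfied, drop it
        out.append(c - {-lit})       # remove the falsified literal
    return out

def dpll(clauses):
    """Return True iff the CNF formula is satisfiable (DPLL backtracking).

    A clause is a frozenset of integer literals (positive = variable,
    negative = its negation); a formula is a list of clauses.
    """
    if not clauses:
        return True                  # no clauses left: satisfied
    if frozenset() in clauses:
        return False                 # empty clause: contradiction
    for clause in clauses:           # unit propagation
        if len(clause) == 1:
            (lit,) = clause
            return dpll(assign(clauses, lit))
    lit = next(iter(clauses[0]))     # branch on an arbitrary literal
    return dpll(assign(clauses, lit)) or dpll(assign(clauses, -lit))

def is_theorem(negated_cnf):
    """A formula is valid iff its negation (given in CNF) is unsatisfiable."""
    return not dpll(negated_cnf)
```

For example, to prove the tautology p ∨ ¬p one checks that its negation, the CNF p ∧ ¬p, is unsatisfiable: `is_theorem([frozenset({1}), frozenset({-1})])` returns `True`.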

WEDNESDAY, 14.04.2021, at 19:00, Live stream
Pavle Subotić, Azure Data Labs
DEBUGGING LARGE SCALE DATALOG WITH PROOF ANNOTATIONS
Logic programming languages such as Datalog have become popular as Domain Specific Languages (DSLs) for solving large-scale, real-world problems, in particular, static program analysis, graph databases and network analysis. The logic specifications that model analysis problems process millions of tuples of data and contain hundreds of highly recursive rules. As a result, they are notoriously difficult to debug. While the database community has proposed several data provenance techniques that address the Declarative Debugging Challenge for Databases, in the cases of analysis problems, these state-of-the-art techniques do not scale.
In this talk, I introduce a novel bottom-up Datalog evaluation strategy for debugging: Our provenance evaluation strategy relies on a new provenance lattice that includes proof annotations and a new fixed-point semantics for semi-naïve evaluation. A debugging query mechanism allows arbitrary provenance queries, constructing partial proof trees of tuples with minimal height. We integrate our technique into Soufflé, a Datalog engine that synthesizes C++ code, and achieve high performance by using specialized parallel data structures. Experiments are conducted with DOOP/DaCapo, producing proof annotations for tens of millions of output tuples. We show that our method has a runtime overhead of 1.31× on average while being more flexible than existing state-of-the-art techniques.
This is joint work with David Zhao and Prof. Bernhard Scholz from the University of Sydney. A version of this talk was presented at POPL this year (2021).
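The core idea of annotating each derived tuple with its minimal proof-tree height can be illustrated on transitive closure. The following is a toy naive bottom-up evaluation, not Soufflé's semi-naïve engine; the rule set and the height convention (base facts have height 1, each recursive rule application adds 1) are assumptions for the sake of the sketch.

```python
# Toy bottom-up Datalog evaluation of transitive closure:
#   path(x, y) :- edge(x, y).
#   path(x, z) :- path(x, y), edge(y, z).
# Each derived tuple carries the minimal height of a proof tree for it,
# mimicking the proof annotations described in the talk.

def transitive_closure_with_heights(edges):
    """Return {(x, y): minimal proof-tree height} for the `path` relation."""
    path = {e: 1 for e in edges}          # base facts: height 1
    changed = True
    while changed:                        # iterate to a fixed point
        changed = False
        for (x, y), h in list(path.items()):
            for (a, b) in edges:
                if a == y:
                    new_h = h + 1         # one recursive rule application
                    if (x, b) not in path or new_h < path[(x, b)]:
                        path[(x, b)] = new_h
                        changed = True
    return path
```

With `edges = {(1, 2), (2, 3), (3, 4)}`, the tuple `(1, 4)` gets height 3, which is exactly the height of its shortest proof tree; a debugger can then reconstruct a minimal partial proof tree for any output tuple by walking these annotations backwards.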

WEDNESDAY, 21.04.2021, at 19:00, Live stream
Branislav Kisačanin, Nvidia; Institut br.ai.ns
COMPUTER ARCHITECTURES FOR AI/ML: WHEN IS WHICH ONE BEST?
In this lecture we will address a question faced by most AI/ML researchers: which computer architecture is optimal for the problem they are solving. Is it a GPU, CPU, FPGA, DSP, ASIC, or something entirely new? Can we run AI on a smartphone? There are many questions and few good answers, even on Google.

WEDNESDAY, 28.04.2021, at 19:00, Live stream
Petar Veličković, DeepMind
GEOMETRIC DEEP LEARNING: GRIDS, GRAPHS, GROUPS, GEODESICS AND GAUGES
The last decade has witnessed an experimental revolution in data science and machine learning, epitomised by deep learning methods. Indeed, many high-dimensional learning tasks previously thought to be beyond reach – such as computer vision, playing Go, or protein folding – are in fact feasible with appropriate computational scale. Remarkably, the essence of deep learning is built from two simple algorithmic principles: first, the notion of representation or feature learning, whereby adapted, often hierarchical, features capture the appropriate notion of regularity for each task, and second, learning by local gradient-descent type methods, typically implemented as backpropagation.
While learning generic functions in high dimensions is a cursed estimation problem, most tasks of interest are not generic, and come with essential pre-defined regularities arising from the underlying low-dimensionality and structure of the physical world. This talk is concerned with exposing these regularities through unified geometric principles that can be applied throughout a wide spectrum of applications.
Such a ‘geometric unification’ endeavour in the spirit of Felix Klein's Erlangen Program serves a dual purpose: on one hand, it provides a common mathematical framework to study the most successful neural network architectures, such as CNNs, RNNs, GNNs, and Transformers. On the other hand, it gives a constructive procedure to incorporate prior physical knowledge into neural architectures and provides a principled way to build future architectures yet to be invented.
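One concrete instance of such a geometric regularity is the translation equivariance of convolutions, the symmetry that underlies CNNs. The sketch below (a generic illustration, not taken from the talk) checks numerically that a circular 1-D convolution commutes with cyclic shifts: shifting the input and then convolving gives the same result as convolving and then shifting.

```python
# Translation equivariance of convolution: shift(conv(x)) == conv(shift(x)).

def shift(x, k):
    """Cyclically shift a sequence by k positions to the right."""
    return x[-k:] + x[:-k]

def circular_conv(x, w):
    """Circular convolution of signal x with a kernel w (len(w) <= len(x))."""
    n = len(x)
    return [sum(w[j] * x[(i - j) % n] for j in range(len(w)))
            for i in range(n)]

x = [1.0, 2.0, 3.0, 4.0]
w = [0.5, 0.25]

# Convolving a shifted signal equals shifting the convolved signal.
assert circular_conv(shift(x, 1), w) == shift(circular_conv(x, w), 1)
```

Architectures respecting other symmetry groups (permutations for GNNs, time shifts for RNNs) can be motivated by exactly the same kind of equivariance argument.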

This online seminar arose as a continuation of the “Serbian AI Meeting” and is intended as a venue where researchers from Serbia and the diaspora, including researchers from universities, scientific institutes, and industry, present scientific topics and results in the field of artificial intelligence.
The link for each individual lecture will be provided one day before the lecture takes place.

Andreja Tepavčević
Chair of the Seminar