Published in Ars Combinatoria 2018
Superconcentrators are graphs that have good connectivity without having too many edges. People like them more when they have fewer edges. We found a construction that makes them a tiny bit smaller than before.
Published in ISAAC 2015
A strange paper in retrospect. Roughly speaking, it identifies the classes of graphs (relational structures) on which one should not expect combinatorial optimization problems to become computationally easier.
Published in FOCS 2015, SIAM Journal on Computing 2016
Somewhat heavy theory. A strong result that concludes the complexity classification of a large class of discrete optimization problems.
Published in SIAM Journal on Imaging Sciences 2016
The classical approach to image restoration reduces to solving combinatorial problems on tree structures. We solved them very efficiently; a toy sketch of the flavor of such problems follows.
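For a taste of what "combinatorial problems on trees" means here, below is a minimal, hypothetical sketch: min-sum dynamic programming for a discrete-label denoising problem on a chain. The paper's actual algorithms handle continuous total-variation objectives on general trees with far better asymptotics; everything in this snippet (labels, costs, the weight `w`) is illustrative.

```python
import numpy as np

def chain_dp(obs, labels, w):
    """Min-sum dynamic programming on a chain: unary cost (obs_i - label)^2,
    pairwise cost w * |l_i - l_{i-1}|. Illustrative toy only; not the
    paper's specialized algorithm for continuous TV problems on trees."""
    n, k = len(obs), len(labels)
    cost = (obs[:, None] - labels[None, :]) ** 2            # unary terms
    pair = w * np.abs(labels[:, None] - labels[None, :])    # pair[l, l_prev]
    msg = cost[0].copy()                                    # best cost ending at node 0
    back = np.zeros((n, k), dtype=int)
    for i in range(1, n):
        total = msg[None, :] + pair                         # total[l, l_prev]
        back[i] = np.argmin(total, axis=1)
        msg = cost[i] + total.min(axis=1)
    lbl = np.zeros(n, dtype=int)
    lbl[-1] = int(np.argmin(msg))
    for i in range(n - 1, 0, -1):                           # backtrack optimal labels
        lbl[i - 1] = back[i, lbl[i]]
    return labels[lbl]

# toy usage: denoise a noisy step signal with 8 gray levels
noisy = np.concatenate([np.random.normal(0.2, 0.05, 20),
                        np.random.normal(0.8, 0.05, 20)])
print(chain_dp(noisy, np.linspace(0, 1, 8), w=0.5))
```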
Published in ASIACCS 2018
Side project in cryptography. Some proposed password hashing functions are not as safe as they seem. The attacks are related to graph pebbling, cool stuff!
Published in SODA 2017, journal version in ACM Transactions on Algorithms, 2018
Matching on general graphs can be generalized in terms of matroids. This has connections to a planar version of constraint satisfaction problems. We developed a nontrivial generalization of the Blossom algorithm that solves a large class of such problems.
Published in CVPR 2018 (oral), best paper award honorable mention
Direct optimization of rank-based metrics can be done via the structured SVM framework. But then there is a nasty algorithmic subproblem to deal with: loss-augmented inference (see the toy sketch below). We dealt with it in optimal runtime.
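To show where that subproblem comes from, here is a hypothetical toy sketch of the structured hinge loss with brute-force loss-augmented inference over a tiny output space. The enumeration is exactly what becomes intractable for rank-based losses such as average precision; the paper's contribution is computing that step in optimal runtime instead.

```python
import itertools
import numpy as np

def structured_hinge(scores, y_true):
    """Structured hinge: max_y [Delta(y, y*) + s(y)] - s(y*), with binary
    outputs y in {0,1}^n, Hamming distance as Delta, and a linear score
    s(y) = <scores, y>. Brute force over all 2^n outputs -- the
    loss-augmented inference step that must be solved efficiently."""
    s_true = scores @ y_true
    best = -np.inf
    for y in itertools.product([0, 1], repeat=len(y_true)):
        y = np.array(y)
        delta = np.sum(y != y_true)            # Hamming loss Delta(y, y*)
        best = max(best, delta + scores @ y)   # loss-augmented inference
    return best - s_true

print(structured_hinge(np.array([0.5, -0.2, 0.1]), np.array([1, 0, 1])))
```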
Published in NIPS 2018
Is Adam really the best we can do? A simple enough update rule can dramatically outperform Adam on some datasets. The optimizer turned out not to be very robust, but it had its moments, such as actually driving the training loss on MNIST to 0.0 in 20 epochs.
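Purely as an illustration of what a loss-driven update rule can look like, here is a hypothetical rule that rescales each step so a first-order model of the loss drops by a fixed fraction toward its minimum. The function name, the hyperparameter `alpha`, and the exact formula are my assumptions, not necessarily the paper's rule.

```python
import numpy as np

def loss_proportional_step(params, grad, loss, loss_min=0.0, alpha=0.15, eps=1e-12):
    """Hypothetical sketch: pick the stepsize eta so that the linearization
    loss(params - eta * grad) ~ loss - eta * ||grad||^2
    decreases by a fraction alpha of (loss - loss_min)."""
    eta = alpha * (loss - loss_min) / (np.dot(grad, grad) + eps)
    return params - eta * grad
```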
Published in CVPR 2019
Why do variational autoencoders disentangle? It turns out to be due to the innocent-looking choice of a diagonal posterior. A theoretical paper about an accidental connection between VAEs and (nonlinear) principal component analysis (PCA).
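The choice in question is the standard diagonal Gaussian posterior. Here is a minimal PyTorch sketch of such an encoder head with its reparameterization and KL term; layer sizes and names are illustrative, not from the paper.

```python
import torch
import torch.nn as nn

class DiagonalGaussianEncoder(nn.Module):
    """Minimal VAE encoder head with a *diagonal* Gaussian posterior
    q(z|x) = N(mu(x), diag(sigma(x)^2)) -- the innocent-looking choice
    the paper connects to nonlinear PCA. Sizes are illustrative."""
    def __init__(self, in_dim=784, hidden=256, z_dim=10):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)
        self.logvar = nn.Linear(hidden, z_dim)   # per-dimension variances only

    def forward(self, x):
        h = self.body(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization
        # KL(q(z|x) || N(0, I)) for a diagonal Gaussian, per sample
        kl = 0.5 * torch.sum(mu**2 + logvar.exp() - logvar - 1, dim=-1)
        return z, kl
```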
Published in ICLR 2020 (spotlight)
Embed blackbox combinatorial solvers into neural networks without any sacrifices! How to turn C++ code for solving e.g. the travelling salesman problem into a differentiable NN building block. Mostly theoretical work, with an update rule resembling classical SVM-based methods but revamped to make good theoretical sense for deep learning. The start of #blackboxbackprop.
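A minimal PyTorch sketch of the resulting building block, following the two-solver-calls scheme from the paper: the forward pass calls the solver as-is, and the backward pass calls it once more on perturbed costs. Here `solver` is any blackbox routine mapping a cost vector to an optimal solution (e.g. a hypothetical wrapper around C++ TSP code) and `lam` is the interpolation hyperparameter.

```python
import torch

class BlackboxSolverLayer(torch.autograd.Function):
    """Sketch of the #blackboxbackprop building block: the backward pass
    returns the gradient of a piecewise-linear interpolation of the
    solver's piecewise-constant output."""

    @staticmethod
    def forward(ctx, w, solver, lam):
        y = torch.as_tensor(solver(w.detach()), dtype=w.dtype, device=w.device)
        ctx.save_for_backward(w, y)
        ctx.solver, ctx.lam = solver, lam
        return y

    @staticmethod
    def backward(ctx, grad_y):
        w, y = ctx.saved_tensors
        w_pert = w + ctx.lam * grad_y            # perturb costs with incoming gradient
        y_lam = torch.as_tensor(ctx.solver(w_pert.detach()),
                                dtype=w.dtype, device=w.device)
        return -(y - y_lam) / ctx.lam, None, None  # gradient w.r.t. costs only

# usage sketch (my_tsp_solver is hypothetical):
# tour = BlackboxSolverLayer.apply(costs, my_tsp_solver, 20.0)
```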
Published in CVPR 2020 (oral), nominated for best paper award
#blackboxbackprop allows differentiating through the ranking function out of the box. This, along with a few other tricks, allows for efficient optimization of rank-based metrics. With minimal implementation overhead, we obtain competitive results on metric learning benchmarks and on object detection.
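Specialized to ranking, the trick needs no external solver at all: ranks minimize the inner product with the scores over all permutations (rearrangement inequality), so the backward pass just ranks a perturbed score vector. A hedged sketch along those lines:

```python
import torch

def hard_rank(scores):
    # rank 0 = highest score; plain argsort-of-argsort, non-differentiable
    return torch.argsort(torch.argsort(scores, descending=True)).float()

class DiffRank(torch.autograd.Function):
    """Sketch of blackbox-differentiable ranking: the ranking function plays
    the role of the combinatorial solver in the #blackboxbackprop scheme."""

    @staticmethod
    def forward(ctx, scores, lam):
        ranks = hard_rank(scores)
        ctx.save_for_backward(scores, ranks)
        ctx.lam = lam
        return ranks

    @staticmethod
    def backward(ctx, grad_ranks):
        scores, ranks = ctx.saved_tensors
        ranks_lam = hard_rank(scores + ctx.lam * grad_ranks)  # one extra ranking call
        return -(ranks - ranks_lam) / ctx.lam, None
```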
Published in ECCV 2020
Using #blackboxbackprop, we wrap strong graph matching solvers into neural network building blocks. With the additional help of a few architectural tricks, we obtain SOTA on deep graph matching benchmarks (e.g. PASCAL VOC).
Published on arXiv, 2020
The cross-entropy method (CEM) is a powerful optimizer for model-based RL. But it is too slow for real-time deployment. We introduce several adjustments and tricks to bridge the “real time” gap. The resulting iCEM is FAAAAST!
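For context, a minimal sketch of the vanilla cross-entropy method that iCEM builds on: sample a Gaussian population, keep the elites, refit the distribution, repeat. iCEM's speedups are tricks layered on top of this loop; all hyperparameter names below are illustrative.

```python
import numpy as np

def cem(objective, dim, iters=10, pop=64, n_elite=8, init_std=1.0, seed=0):
    """Vanilla cross-entropy method (minimization). iCEM adds tricks on top
    of this loop to make it fast enough for real-time planning."""
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.full(dim, init_std)
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, dim))   # candidate solutions
        scores = np.array([objective(s) for s in samples])
        elites = samples[np.argsort(scores)[:n_elite]]     # best-scoring candidates
        mean, std = elites.mean(axis=0), elites.std(axis=0) + 1e-6  # refit Gaussian
    return mean

# toy usage: minimize a shifted quadratic
print(cem(lambda x: np.sum((x - 3.0) ** 2), dim=5))
```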