Leading the Bolt teams in Bucharest. We're working on cool products like food delivery, and interesting and challenging platforms like geo, A/B testing, user accounts and route tracking. We're using a modern stack, building real-time systems and handling large amounts of data. If that sounds like your cup of tea, we're hiring and growing! DM me or hit careers.bolt.eu for opportunities.
November 2020 - present in Bucharest, Romania
Leading the Bolt platform and developer infrastructure groups. We're working on cool products like maps, data foundations, microservices tooling, etc. We're using a modern stack, building real-time systems and handling large amounts of data. If that sounds like your cup of tea, we're hiring and growing! DM me or hit careers.bolt.eu for opportunities.
November 2019 - October 2020 in Bucharest, Romania
Leading the teams in Bucharest. We're working on cool products like food delivery, and interesting and challenging platforms like geo, A/B testing, user accounts and route tracking. We're using a modern stack, building real-time systems and handling large amounts of data.
March 2019 - November 2019 in Bucharest, Romania
Leading the geo, A/B testing, user accounts and route tracking teams at Bolt. We're using a modern stack, building real-time systems and handling large amounts of data.
January 2019 - March 2019 in Bucharest, Romania
Leading the geo team at Bolt. Dealing with maps, routing, and all things geo. The team handles the full stack: from serving, experiments and the APIs we provide to internal clients, to Hadoop data processing and ML model integration.
April 2018 - January 2019 in Bucharest, Romania
Learning the ropes of a new industry. Mostly dealing with the geo infrastructure at Bolt - maps, routing, geocoding etc. Dealing with the full stack: serving infra for internal clients, Hadoop jobs for analytics and ML etc.
May 2016 - March 2018 in Bucharest, Romania
I worked on the ad systems side. This means the job ads you see on almost any page, but also the image display ads. The work was a mix of building low-latency serving systems, high-throughput data-crunching pipelines and plain-old CRUD applications.
May 2013 - March 2016 in London, UK
I worked on the AdSense product. The core work involved building streaming log-analysis pipelines which produce revenue optimization recommendations for clients. I made use of common Google infrastructure; the tools I interacted with most closely were Protocol Buffers/gRPC, Borg, BigTable, MapReduce/Flume and MillWheel. For the last three quarters I worked on the Optimization Tab as one of the original three engineers in a group which later grew to six contributors. This was a large greenfield project which aimed to provide a single place for customers to discover and experiment with ways to improve their revenue. I owned one of the two sections, which included building a front-end application and a service backend, as well as coordinating the work of other engineers as the project grew in scope.
October 2012 - January 2013 in Bucharest, Romania
I developed algorithms for gesture recognition in an embedded setting (phones, TVs etc.). The work was equal parts computer vision and machine learning research (literature study, proposals for new methods where warranted) and development (getting things to work in real time with limited resources). The prototype was built in MATLAB under Windows, while the production code was written in C for the embedded target.
October 2008 - September 2010 in Bucharest, Romania
I worked on software for wireless network switches. More precisely, I was on the team responsible for a network simulator used in development and testing. I also did minor work on parts of the actual switch control-panel software. The work was mostly done in C and Tcl under Linux and BSD.
Bazel macros for building Python packages and interacting with PyPI.
I am the author and main contributor to this project.
Python library for image hashing and deduplication.
One interesting algorithmic hurdle in ZigZag was figuring out which images were content-level duplicates of each other even though their bitmap representations differed (different sizes, compression artifacts etc.); to solve it I built a Python package called SDHash.
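The core idea behind this kind of perceptual deduplication can be sketched with a simple average hash over a downscaled grayscale grid (an illustrative toy, not SDHash's actual algorithm; all names here are made up for the example):

```python
import random


def average_hash(pixels):
    """64-bit hash of an 8x8 grayscale grid: bit i is set when
    pixel i is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > avg)


def hamming_distance(a, b):
    """Number of differing bits; a small distance suggests duplicates."""
    return bin(a ^ b).count("1")


random.seed(42)
# an 8x8 "image" of grayscale values
img = [[random.randint(0, 255) for _ in range(8)] for _ in range(8)]
# simulate recompression: same content, slightly perturbed pixels
noisy = [[min(255, max(0, p + random.randint(-4, 4))) for p in row]
         for row in img]

# identical content hashes identically; near-duplicates land close in hash space
assert hamming_distance(average_hash(img), average_hash(img)) == 0
assert hamming_distance(average_hash(img), average_hash(noisy)) <= 16
```

The point is that the hash depends on the relative brightness structure rather than the exact bytes, so resizing or recompression perturbs only a few bits.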
Unit testing module for table-driven tests, for Python 3.
I am the author and main contributor.
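The table-driven style such a module supports can be sketched with plain `unittest` subtests (an illustrative pattern, not the module's actual API):

```python
import unittest


class TestUpper(unittest.TestCase):
    # the "table": each row is one (input, expected) case
    CASES = [
        ("abc", "ABC"),
        ("Hello", "HELLO"),
        ("", ""),
    ]

    def test_upper(self):
        for text, expected in self.CASES:
            # subTest reports each row's failure independently,
            # instead of stopping at the first failing row
            with self.subTest(text=text):
                self.assertEqual(text.upper(), expected)


if __name__ == "__main__":
    unittest.main()
```

Keeping the cases in a table makes adding a regression test a one-line change.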
Experiments in deep neural networks and sparse coding.
This project was part of the requirements for receiving an MSc after finishing my master's studies. I built an ML system for classification/regression centered around Deep Neural Networks and Sparse Coding methods. Thanks to an original contribution, it managed to beat the best-known results on a number of benchmark problems. I also used the system on an image text detection problem, as a showcase of a more complex application. One research paper (Sparse Coding Neural Gas Applied to Image Recognition, in WSOM 2012: The 9th Workshop on Self-Organizing Maps) was produced as a result of this work. The project was about 10000 lines of MATLAB and 3000 lines of C for the numerical computation.
Verilog designs for a minimalist CPU and assorted I/O devices. Also includes an assembler and a system builder, written in Haskell.
This project was part of the requirements for receiving a BSc after finishing my undergraduate studies. I designed and implemented a minimalist microprocessor, a set of assorted I/O devices, as well as a system builder and assembler. I used a Digilent Spartan-3E FPGA development board to test and showcase my designs. The project was about 10000 lines of Verilog, 1500 lines of Haskell, 600 lines of Makefiles for the build system and 500 lines of processor-specific assembly. Overall, ~13000 lines of code across the system core, several tools and 5 example applications.
The smart funny images browser.
A small image-aggregation Android/iOS app. I built the server-side component and the Android app, while another developer built the iOS app. I did this project as an exercise in building a multi-platform internet application with a service-based architecture. I used Python with Django for the API server, Thrift and its Python RPC system for various services, PIL for image handling, vanilla Android for the Android application and Chef for configuration management.
WSOM 2012: The 9th Workshop on Self-Organizing Maps.
The work belongs to the field of Deep Learning. It studies a dictionary-learning method called Sparse Coding Neural Gas and applies it, together with classical techniques from the field, to an image recognition problem. The article was written as part of the work for my master's thesis and has appeared in
Advances In Self-Organizing Maps, AISC 198.
October 2010 - July 2012
October 2006 - July 2010