At Deepomatic, we believe artificial intelligence is the way to unlock the world of tomorrow. We believe these technologies should be made accessible to all instead of being the privilege of a few big technology companies. We are developing a web platform used by Global 500 companies like Airbus, Valeo and Compass Group to solve problems ranging from detecting cancer and making motorways smarter to enabling autonomous cars. We need talented and creative engineers to help us make that new world a reality.
Your main goal will be to design and build the architecture that will allow our platform to tackle even more ambitious problems. Here are a few examples of the challenges you might be confronted with:
What's the best way to set up a data processing pipeline?
How do you design a flexible yet "type-checked" data processing pipeline? (see the sketch after this list)
How do you deploy deep-learning algorithms to a fleet of embedded systems at scale?
How do you set up a monitoring system to evaluate the performance of deployed algorithms and make sure their accuracy does not silently drop?
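To give a flavour of the second question, here is a minimal sketch, in plain Python, of one way a pipeline can stay flexible while still being type-checked. The `Stage` abstraction and the stage names are purely illustrative assumptions, not part of our actual codebase.

```python
# Hypothetical sketch: composable pipeline stages whose input/output types
# are tracked with generics, so a static checker such as mypy can reject
# mismatched pipelines. None of these names come from Deepomatic's platform.
from typing import Callable, Generic, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")


class Stage(Generic[A, B]):
    """A pipeline stage: a typed transformation from A to B."""

    def __init__(self, fn: Callable[[A], B]) -> None:
        self.fn = fn

    def then(self, nxt: "Stage[B, C]") -> "Stage[A, C]":
        # Composition preserves end-to-end types: the output type of this
        # stage must match the input type of the next one.
        return Stage(lambda a: nxt.fn(self.fn(a)))

    def __call__(self, value: A) -> B:
        return self.fn(value)


# Example: decode -> classify, each stage independently reusable.
decode = Stage(lambda raw: len(raw))                        # bytes -> int (stand-in for image decoding)
classify = Stage(lambda size: "large" if size > 10 else "small")  # int -> str

pipeline = decode.then(classify)  # Stage[bytes, str]
print(pipeline(b"some raw image bytes"))  # -> "large"
```

The point of the sketch is that new stages can be added freely, while an incompatible chain (say, feeding a `str` stage into an `int` stage) is caught before anything runs.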
Regarding our stack and tools, we work with distributed architectures and a continuous delivery process. The diagram below will give you an idea of our entire stack. For this role, we are looking for people who can work proficiently with the orange tools, but who also have the will and curiosity to try even newer technologies.
Contribute to our deep learning platform by adding exciting new capabilities to it.
Build modern & robust ways to manage execution flow in a distributed environment based on microservices.
Handle the scalability of our platform by improving the efficiency of its systems.
Build exciting technological demos based on our internal tools or brand-new stacks.
3+ years of work experience
Experience with Python is required
Experience with C++ is a plus
Experience with tools like Django, Celery, RabbitMQ, Redis and/or PostgreSQL will be greatly appreciated
Great human qualities and a love for teamwork
Great oral and written communication skills in English
(optional) Experience with Hadoop or Spark
(optional) Experience with devops tools like Kubernetes