The company offers advanced insights on human mobility, built on cutting-edge data science, proprietary machine learning algorithms, and deep technology, capturing billions of signals every day from cell towers and other unique sources. It works with leading telecom companies and data partners around the globe to capture information about people’s geographical locations, movement habits, and demographics, all fully anonymized and aggregated. It believes in unlocking the power of data to change the cities we live in, solve real-world business problems, enhance people’s lives, and ultimately benefit the world.
You will be working with a team of Solution Architects, Software Engineers, and Data Scientists, developing solutions that provide mobility insights to large transportation operators and transport consultancies. Your work will help the company’s customers improve how millions of people travel every day.
The solutions will leverage state-of-the-art technologies including (but not limited to) Apache Mesos/Marathon, Kubernetes, Ceph, Apache Spark and Flink.
Driving the software architecture, design, development, testing, deployment, and operation of the data analytics platform, which currently processes more than 100 billion events every day
Developing Scala, Flink and Spark pipelines and APIs, operating on terabytes of location-based data across several geographies
Designing and developing both bare-metal and cloud-based solutions, leveraging state-of-the-art technologies, tools and methodologies
Working together with a multi-disciplinary team of Software Engineers and Data Scientists
They are an equal opportunity employer and value diversity. They do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
Bachelor's degree or equivalent practical experience.
Coding experience in Scala or Python.
Good written and spoken English.
Proactive, with a demonstrated passion for the product and its technologies; able to thrive under pressure; open to learning and feedback.
Experience with big data technologies such as Apache Spark.
Experience working with distributed technologies and frameworks.
Experience in processing spatial or mobility data at scale.
Experience with container technologies (Docker, Kubernetes).
Experience with automation and workflow schedulers (e.g., Apache Airflow).
Be part of an exciting and ambitious start-up that puts its people at the heart of its business.
Be part of a diverse, international, cross-disciplinary team of highly motivated, hands-on experts that tackle unique challenges with a positive spirit and lots of fun.
Flexible work schedule.
Additional company holidays.
Career development programs.