Jumia is the leading pan-African e-commerce platform. Founded in 2012, Jumia’s mission is to improve the quality of everyday life in Africa by leveraging technology to deliver innovative, convenient and affordable online services to consumers, while helping businesses grow as they use our platform to reach and serve consumers.
Our platforms consist of our marketplace, which connects sellers with consumers; our logistics service, which enables the shipment and delivery of packages from sellers to consumers; and our payment service, which facilitates transactions among participants active on our platform in selected markets. Through our online platforms, consumers can access a wide range of physical and digital goods and services, including fashion, electronics and beauty products, as well as hotel and flight bookings and restaurant delivery.
With over 3,000 employees in 14 countries across 6 African regions, Jumia is led by talented leaders offering a great mix of local and international experience, and is backed by high-profile shareholders. Jumia is committed to creating sustainable impact for Africa. Jumia offers unique opportunities in a vibrant and booming environment, creating new jobs, new skills, and empowering a new generation. We are looking for talented people with a passion for Africa to join our team and embark on our exciting journey!
- Contribute to the design, development and maintenance of the company analytics platforms, including databases, large-scale processing systems and data visualization;
- Support data scientists in building scalable pipelines (e.g. recommender systems, image recognition, product sorting) using our Big Data platform;
- Collect, store, process, and support the analysis of huge sets of data, both structured and unstructured.
- Knowledge of the Linux operating system (OS, networking, process level);
- Understanding of one or more object-oriented programming languages (Java, C++, C#, Python);
- Fluent in at least one scripting language (Shell, Python, Ruby, etc.);
- Understanding of Big Data technologies (Hadoop, HBase, Spark, Kafka, Flume, Hive, etc.);
- Knowledge of building complex data processing pipelines using continuous integration tools;
- Demonstrable skills and experience using SQL with large data sets (for example, SQL Server, Oracle, DB2);
- Experience in building ETL processes (SSIS, Oracle Data Integrator, code-based);
- Experience in modelling databases for analytical consumption (Star Schema; Snowflake; Data Vault);
- Knowledge of NoSQL databases and big data databases (Google BigQuery, Cassandra, MongoDB);
- Experience in OLAP tools (SSAS, IBM Cognos) with multidimensional and/or tabular models;
- Experience with high throughput, 24x7 systems;
- Knowledge of professional software engineering practices & best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations;
- Strong analytical skills;
- Experience supporting and working with cross-functional teams in a dynamic environment.
- Nice to have: Experience with data visualization tools (Qlikview; Tableau; PowerBI);
- Nice to have: Experience in designing big data/distributed systems;
- Nice to have: Experience creating and driving large scale ETL pipelines.
- A unique experience in an entrepreneurial, yet structured environment
- The opportunity to become part of a highly professional and dynamic team working around the world
- Unparalleled personal and professional growth, as our longer-term objective is to train the next generation of leaders for our future internet ventures
Please send your CV in English. CVs in other languages will not be considered.