Based in Switzerland, 42matters offers a comprehensive suite of products and services dedicated to App Intelligence and Analytics. We harness a powerful blend of technical prowess and business acumen to deliver in-depth analyses of the latest trends in the mobile app and CTV industries. Our client portfolio boasts some of the world's leading mobile companies, whom we empower with data-driven insights to foster growth and innovation.
Join Us as a Data Engineer:
If you're fueled by a passion for handling large data sets and excited by the opportunity to explore mobile app data, we have a place for you. Your primary responsibilities will include:
- Designing, building, and maintaining data pipelines that power our products, both existing and forthcoming.
- Engaging in the full spectrum of our tech stack, from DevOps upkeep and optimization to joint endeavors with fellow engineers to enhance and sustain our applications and services.
The Ideal Candidate has:
- In-depth, hands-on experience with data engineering.
- Proficiency in Python, with familiarity in tools and databases central to our operations.
- The ability to collaborate effectively with a team, yet take the initiative and function autonomously when required.
Our culture celebrates innovation, collaboration, and personal growth. At 42matters, you're a pivotal part of a team that's shaping the future of mobile app analytics. We provide our team with growth opportunities, a balanced work environment, and a range of perks. Dive deeper into what makes us unique: 42matters Company Culture.
Job type: Full-time
Starting date: As soon as possible
Responsibilities:
- Write data pipelines that automatically extract, transform, and load (ETL) data, using both traditional and large-scale distributed technologies.
- Extend and optimize our current services and applications by making large amounts of data accessible to both our data scientists and our customers (via our services/products).
- Write and maintain code that verifies data quality.
- Help build a reliable, sustainable, and scalable data infrastructure.
Requirements:
- 3+ years of work experience with automated data collection and cleaning.
- Good experience with Amazon Web Services (e.g. ECS, RDS, S3, Elasticsearch).
- Good knowledge of Python.
- Good knowledge of MongoDB.
- Experience with Docker.
- Familiarity with relational data stores (e.g. PostgreSQL, Redshift).
- Solid DevOps skills.
- Fluent English.
- Self-motivated team player, comfortable in a small, intense, high-growth start-up environment.
- Strong educational background: Bachelor's or Master's degree in Computer Science or another technical, science, or math field.
Nice to have:
- Proficiency in other programming languages (e.g. Bash, Java).
- Experience with crawling libraries (e.g. Selenium, Scrapy, Beautiful Soup).
- Ability to identify and resolve performance issues.