Our Company:
If you are looking to join a team where your opinion is valued, your contributions are noticed, and you enjoy working with fun and talented people from all over the world, then this is the place for you.
If you have a desire to work in an organisation that is:
- Passionate about its people
- Focused on delivering the very best tech to our customers
- Offering the flexibility to work how and where you are most successful
- Obsessed with our customers’ success
- The leading SaaS platform to automate partnerships - affiliate, influencer, technology partners, and more!
- Entrepreneurial in spirit with a culture that rewards collaboration and curiosity
- Obsessed with making a difference in business and to the wider community
Impact is the global leader in Partnership Automation, working with innovative brands like Ticketmaster, Levi’s, Microsoft, Airbnb, and Uber to help them manage their online affiliate, influencer, brand-to-brand, and content partnerships. The Impact Partnership Cloud covers the full partnership lifecycle, including onboarding, tracking ads and paying partners, recruiting new partners, data and marketing intelligence, and protection from fraud. Founded in 2008, Impact has grown to over 500 employees and ten offices across Europe, the United States, Africa, and Asia, so there is plenty of opportunity for growth and advancement.
Your Role at Impact:
The Senior Analytics Engineer is a technical data professional able to manage, process and analyse large datasets using big data technologies such as Apache Spark, SingleStore and BigQuery, as well as visualise and report on these datasets. The ideal candidate will be proficient in designing and implementing efficient data workflows to move, transform, aggregate and enrich data from various sources into a centralised data warehouse and purpose-built data marts, ensuring internal code management and data quality standards are adhered to. The role also provides users with access to standard reports, rich visualisations and other analytical data assets.
The position requires a strong analytical mindset, attention to detail, programming skills and experience with big data technologies. This is a highly collaborative role as the engineer needs to engage with Subject Matter Experts to implement business logic, understand source data structures and ensure data outputs are accurate, fit-for-purpose, pass quality assurance and provide value to the business.
What You'll Do:
- Design, develop and maintain data models, data marts and analytical data stores
- Work closely with Subject Matter Experts (SMEs), Business and Technical stakeholders to define and document business logic and transformation rules to be used in data load jobs and (materialised) analytical views
- Build and maintain data load and transformation jobs to populate data lakes, data marts and data warehouses following the Extract-Load-Transform (ELT) and Extract-Transform-Load (ETL) paradigms as appropriate
- Create and maintain reusable data assets ready for consumption by machine learning models, data visualisation tools and data analysts
- Create and maintain entity-relationship diagrams (ERDs), data dictionaries and data flow diagrams
- Create and maintain table and column metadata
- Manage code releases, deployment cycles and the associated change management processes
- Build and maintain standard reports for internal stakeholders
- Contribute to the development and expansion of common utility libraries used by data teams
- Maintain high standards of quality, integrity and accuracy in produced data assets
- Troubleshoot and resolve any issues that arise relating to data assets in the production environment in a timely manner
- Optimise total system performance related to ETL/ELT workloads and analytical queries, ensuring efficient use of compute resources and stability of data systems
- Optimise code related to ELT/ETL workloads for simplicity, reusability and efficiency and in line with best practice
- Conduct periodic integrity checks on production data assets
- Safeguard sensitive company data
- Work with the data Quality Assurance (QA) function to extend and enhance programmatic validation of production data assets
- Stay up-to-date with the latest big data technologies and best practices
- Automate manual data load, data transformation and data management processes
- Review and sign off on code changes
- Mentor and train junior colleagues
- Actively participate in the hiring process and performance management of team members
What You Have:
- Bachelor's or Master's degree in Computer Science, Data Science or a related field
- 6+ years of experience in data pipeline development and data warehousing using big data technologies such as Apache Spark, Google DataFlow, SingleStore, Impala, Kudu and/or BigQuery
- Proven track record in developing enterprise-level data marts
- Experience with Databricks advantageous
- Experience with dbt advantageous
- Experience with Google Cloud Platform and BigQuery advantageous
- Strong SQL development experience required
- Strong Python programming skills required
- Strong knowledge of relational database management systems
- Strong data modelling and schema design experience
- Experience with workflow management tools such as Airflow, Luigi or Oozie advantageous
- Knowledge of data integration patterns, data load patterns and best practices required
- Knowledge of software development best practices and version control tools
- Strong analytical and problem-solving skills
- Strong written and verbal communication skills
- Good leadership and workload management skills and experience advantageous
- Ability to work in a team environment and collaborate with internal stakeholders
Benefits/Perks:
Unlimited PTO policy
Take the time off that you need. We are truly committed to a positive work-life balance, recognising that it is important to be happy and fulfilled in both
Training & Development
Learn our advanced partnership automation products
Medical Aid and Provident Fund
Group schemes with Discovery & Bonitas for medical aid
Group scheme with Momentum for provident fund
Stock Options
4-year vesting schedule pending Board approval
Internet Allowance
Flexible work hours
Casual work environment
_________
All employees and applicants for employment shall be given fair treatment and equal employment opportunity regardless of their race, ethnicity or ancestry, color or caste, religion or belief, age, sex (including gender identity, gender reassignment, sexual orientation, pregnancy/maternity), national origin, weight, neurodivergence, disability, marital and civil partnership status, caregiving status, veteran status, genetic information, political affiliation, or other prohibited non-merit factors.
#LI-CT1