Libertas Funding

Data Science and Machine Learning Manager

Oct. 24, 2022

Anywhere

POSITION SUMMARY: The Data Lead will own Libertas's data and machine learning initiatives, from early-stage ideation, analysis, and model development through planning and building the necessary infrastructure. The goal of the Data Lead is to develop a strategic roadmap that enables Libertas to make data-driven underwriting and business decisions. This position is ideal for a technologist and leader who is excited to be hands-on while building a team and strategy for data science at Libertas.

ESSENTIAL FUNCTIONS AND RESPONSIBILITIES (duties critical/central to accomplishing the purpose of the job, listed in order of importance):
- Own the data and data science roadmap at Libertas, including vision, strategy, milestone scoping, and prioritization
- Collaborate with and influence cross-functional stakeholders so that the team's work has a meaningful and direct impact on company outcomes
- Initially serve as technical lead on projects: scoping and prioritizing data-related problems, ensuring timeliness and quality of deliverables, and setting the standard for code quality and data science best practices
- Drive the evolution of data products and the data platform, with a focus on serving analytics and data science pipelines
- Identify and scope opportunities to support business needs through intuitive and efficient data architecture and services, including identifying new approaches and tools
- Build and manage a team of data engineers, analysts, and data scientists to scale Libertas's data capabilities
- Participate in the design of data and analytics infrastructure
- Partner with cross-functional stakeholders across the organization to scope and prioritize data-related problems
- Guide the team in using existing frameworks and architecture, including cloud-based infrastructure, vendor APIs, SaaS, and CI/CD; adapt these for improved effectiveness, changing business needs, maintainability, and flexibility
- Adhere to compliance procedures and internal/operational risk controls in accordance with all applicable regulatory standards, requirements, and policies

KNOWLEDGE, SKILLS, AND ABILITIES (required for this job):
- An established track record of building and leading high-performing data and data science teams, including establishing hiring and retention best practices, managing stakeholders effectively, and creating systems for project success
- Strong technical leadership; an independent, critical, and analytical thinker who can rapidly adapt to changing business and organizational demands
- Ability to communicate complex data and modeling results in a simple, actionable way to diverse stakeholders
- Excellent interpersonal and relationship-building skills
- Track record of successfully building and deploying production-level data modeling workflows/pipelines for consumption by applications and internal customers
- Deep knowledge of data governance best practices, including data quality/integrity and privacy
- Fluency in multiple relevant programming languages and tools, e.g., SQL, Python, Airflow
- Experience with the full DS project lifecycle, from ideation through deployment; hands-on experience with models such as XGBoost and multi-class classifiers; knowledge of complex modeling workflows, including multi-modal pipelines, is a strong plus
- Experience with financial data is a strong plus
- Experience with visualization tools and best practices; Tableau-specific experience is a strong plus

EDUCATION/EXPERIENCE REQUIREMENTS (minimum education and/or years of experience required to perform the job):
☒ High school diploma or equivalent work experience
☒ College degree or equivalent work experience

Required Years of Work Experience: 7+ years of experience

PREFERRED KNOWLEDGE, SKILLS, AND EXPERIENCE FOR THIS JOB:
- Bachelor's degree or higher in Computer Science or a related STEM field
- Experience with machine learning and AI
- Experience with data management tools
- Experience with Tableau and other visualization tools
- Knowledge of SQL, Python, Airflow, etc.
- Knowledge of AWS PaaS services, including Redshift, Glue, EFS, and S3
- Knowledge of the Agile process and Test-Driven Development
- Experience in startup companies

Built with ❤️ for the ML Community by Dom © 2022 RemoteML